load HCVEgyptData.mat
disp(HCV)
Age Gender BMI Fever NauseaVomting Headache Diarrhea Fatiguegeneralizedboneache Jaundice Epigastricpain WBC RBC HGB Plat AST1 ALT1 ALT4 ALT12 ALT24 ALT36 ALT48 ALTafter24w RNABase RNA4 RNA12 RNAEOT RNAEF BaselinehistologicalGrading Baselinehistologicalstaging ___ ______ ___ _____ _____________ ________ ________ __________________________ ________ ______________ _____ __________ ___ __________ ____ ____ ____ _____ _____ _____ _____ ___________ __________ __________ __________ __________ __________ ___________________________ ___________________________ 56 1 35 2 1 1 1 2 2 2 7425 4.2488e+06 14 1.1213e+05 99 84 52 109 81 5 5 5 6.5533e+05 6.3454e+05 2.8819e+05 5 5 13 2 46 1 29 1 2 2 1 2 2 1 12101 4.4294e+06 10 1.2937e+05 91 123 95 75 113 57 123 44 40620 5.3864e+05 6.3706e+05 3.368e+05 31085 4 2 57 1 33 2 2 2 2 1 1 1 4178 4.6212e+06 12 1.5152e+05 113 49 95 107 116 5 5 5 5.7115e+05 6.6135e+05 5 7.3595e+05 5.5883e+05 4 4 49 2 33 1 2 1 2 1 2 1 6490 4.7946e+06 10 1.4646e+05 43 64 109 80 88 48 77 33 1.0419e+06 4.4994e+05 5.8569e+05 7.4446e+05 5.823e+05 10 3 59 1 32 1 1 2 1 2 2 2 3661 4.6064e+06 11 1.8768e+05 99 104 67 48 120 94 90 30 6.6041e+05 7.3876e+05 3.7315e+06 3.3895e+05 2.4286e+05 11 1 58 2 22 2 2 2 1 2 2 1 11785 3.8825e+06 15 1.3123e+05 66 104 121 96 65 73 114 29 1.1575e+06 1.0869e+06 5 5 5 4 4 42 2 26 1 1 2 2 2 2 2 11620 4.7473e+06 12 1.7726e+05 78 57 113 118 107 84 80 28 3.2569e+05 1.034e+06 2.751e+05 2.1457e+05 6.3516e+05 12 4 48 2 30 1 1 2 2 1 1 2 7335 4.4059e+06 11 2.1618e+05 119 112 80 127 45 96 53 39 6.4113e+05 72050 7.873e+05 3.7061e+05 5.063e+05 12 3 44 1 23 1 1 2 2 2 1 2 10480 4.6085e+06 12 1.4889e+05 93 83 55 102 97 122 39 45 5.9144e+05 7.5736e+05 5 3.7109e+05 2.0304e+05 5 2 45 1 30 2 1 2 2 1 1 2 6681 4.4553e+06 12 98200 55 68 72 127 81 125 43 30 1.1512e+06 2.3049e+05 2.6732e+05 2.753e+05 5.5552e+05 4 2 37 2 24 2 1 2 1 2 2 1 4437 4.265e+06 12 1.6603e+05 103 124 111 74 53 123 101 33 1.0231e+06 1.0319e+05 7.3193e+05 4.4847e+05 59998 15 2 36 1 22 2 2 1 1 1 1 1 6052 4.1302e+06 13 1.4427e+05 75 49 93 52 46 46 59 45 1.3771e+05 1.123e+06 5.6144e+05 63145 8.062e+05 16 1 45 2 25 2 1 1 1 2 1 2 9279 4.1169e+06 13 2.03e+05 97 101 66 53 95 55 104 26 9.3644e+05 5.3697e+05 5 5 5 8 1 34 1 22 1 2 1 1 2 2 1 5638 4.3216e+06 14 1.4111e+05 120 61 64 51 78 90 113 23 3.9298e+05 8.8432e+05 5.8683e+05 1.8278e+05 7.8215e+05 9 2 40 2 32 2 2 2 1 2 1 1 11507 4.1656e+06 14 2.2287e+05 127 122 106 105 88 111 111 36 1.1337e+06 1.1119e+06 4.213e+05 4.3754e+05 1.2461e+05 8 2 58 1 34 2 1 1 1 2 1 1 8035 4.8965e+06 11 1.4951e+05 117 53 50 80 120 66 86 34 6.1495e+05 3.143e+05 83690 6.7149e+05 1.3515e+05 15 1 61 1 35 1 2 2 2 1 1 2 10843 4.1652e+06 10 1.9764e+05 86 105 70 86 83 87 47 33 9.001e+05 7.2146e+05 5 5 5 3 4 55 2 24 2 1 2 2 2 2 2 8476 4.4669e+06 14 1.6328e+05 53 101 50 95 112 97 68 27 1.1453e+06 2.3099e+05 4.5788e+05 3.1836e+05 2.5642e+05 4 3 56 1 27 1 2 2 2 2 2 2 6599 4.4485e+06 15 1.9064e+05 53 124 62 76 57 46 93 26 5.0676e+05 3.5918e+05 7.434e+05 4.0518e+05 1.6298e+05 6 4 35 2 23 2 2 1 1 1 1 2 4845 4.436e+06 10 1.1182e+05 115 121 63 127 95 124 93 42 1.0805e+06 76404 7.1716e+05 4.0431e+05 4.7772e+05 16 4 57 2 23 1 1 2 2 1 1 1 5925 4.0316e+06 15 1.1656e+05 86 109 118 119 55 103 84 32 1.6962e+05 7.8602e+05 6.6908e+05 5.3119e+05 2.8252e+05 6 2 33 1 25 2 1 2 2 2 2 2 9952 4.9947e+06 10 1.0902e+05 84 77 67 81 117 68 42 32 1.1352e+06 5.7275e+05 5 5 5 4 1 41 1 23 1 2 2 2 2 1 2 7961 4.5955e+06 14 94733 45 92 103 104 40 115 93 33 2.9338e+05 4.4058e+05 53098 18292 1.8734e+05 10 3 39 2 29 1 2 1 2 1 1 2 7136 4.6252e+06 10 2.1136e+05 70 102 76 58 111 95 58 25 9.9394e+05 9.9265e+05 96482 3.349e+05 7.6276e+05 15 4 33 2 24 1 2 2 2 2 1 2 6057 4.3008e+06 11 2.2214e+05 62 91 116 128 41 70 106 43 2.4343e+05 9.8137e+05 12504 3.6002e+05 7.5338e+05 6 3 43 2 34 2 2 2 1 1 1 1 6648 4.5293e+06 15 1.0987e+05 48 112 99 85 59 87 78 35 9.553e+05 5.4065e+05 5 5 5 9 4 51 1 34 2 1 2 2 1 1 2 11032 4.0526e+06 15 94503 41 54 128 64 71 89 87 34 7.6636e+05 5.3127e+05 7.376e+05 7.3486e+05 3.7284e+05 5 1 39 2 33 2 1 2 1 1 1 2 5234 4.9062e+06 12 1.9031e+05 61 120 113 75 88 114 99 43 4.8647e+05 45990 45578 7.3329e+05 19572 15 2 57 2 26 1 2 2 1 1 1 2 6038 4.7633e+06 13 1.2672e+05 51 118 98 42 93 53 83 45 2.8537e+05 1.8666e+05 5 5 5 9 3 47 2 29 1 1 2 1 2 1 2 5846 4.7535e+06 15 1.0473e+05 120 72 117 126 45 95 49 38 4.2614e+05 2.4778e+05 7.6702e+05 3.7712e+05 3.1515e+05 9 1 55 2 33 1 2 2 1 2 1 2 5383 3.9994e+06 15 1.8226e+05 96 49 59 88 62 58 81 41 1.1943e+06 9.2868e+05 29778 1.2425e+05 2.4405e+05 7 1 58 2 35 2 2 2 2 1 1 1 7378 3.9989e+06 10 2.0111e+05 57 110 128 96 69 105 72 26 5.5771e+05 2.8771e+05 6.2359e+05 66891 35044 5 2 47 2 25 2 1 2 2 2 1 2 7486 4.5995e+06 11 1.6735e+05 94 64 54 122 64 64 96 24 6.0406e+05 4.1631e+05 3.2335e+05 7.1666e+05 6.7855e+05 8 1 61 1 33 1 2 2 2 1 1 1 11770 4.5811e+06 13 1.2564e+05 42 47 82 102 48 76 53 34 1.1599e+06 3.1851e+05 4.6326e+05 3.8135e+05 2.8291e+05 10 1 37 1 27 2 2 1 2 2 2 2 6441 4.0755e+06 11 1.1874e+05 42 118 67 111 48 107 101 45 2.726e+05 91626 4.0452e+05 6.741e+05 2.4255e+05 11 3 41 1 29 1 2 1 1 2 1 2 10304 4.1526e+06 14 1.2081e+05 128 102 79 63 80 86 127 40 1.1652e+06 51508 3.6718e+05 5.8801e+05 7.4633e+05 16 3 60 2 32 2 2 1 2 2 2 1 7365 4.0232e+06 14 2.2247e+05 52 126 67 126 126 41 54 35 1.124e+05 4.8911e+05 4.6164e+05 3.3601e+05 2.8726e+05 5 2 54 1 29 1 1 1 2 2 1 1 10704 4.9116e+06 10 1.7173e+05 40 43 46 64 101 45 91 43 47190 5.81e+05 7.8978e+05 2.6294e+05 1.1897e+05 15 1 40 2 28 2 1 2 1 1 1 1 3009 4.3542e+06 14 95604 69 58 62 50 60 84 114 31 9.6129e+05 71146 28241 31034 1417 9 2 32 1 31 1 2 1 1 1 1 2 9956 3.9395e+06 14 1.9643e+05 78 81 90 48 68 83 128 39 8.551e+05 1.0252e+05 4.0731e+05 2.2001e+05 4.055e+05 7 4 58 2 33 1 2 2 2 1 2 2 6627 3.8896e+06 14 1.829e+05 106 69 127 99 47 103 111 33 1.0475e+06 3.2035e+05 3.4945e+05 5.4683e+05 6.4394e+05 13 1 37 2 23 2 2 1 1 2 2 1 10393 4.3086e+06 11 1.8411e+05 93 56 40 124 101 50 90 33 2.7135e+05 2.0633e+05 1.5122e+05 3.0773e+05 1.7452e+05 6 1 58 1 23 1 1 1 2 1 1 2 10236 4.7979e+06 14 1.0151e+05 127 92 94 113 96 126 108 39 2.7251e+05 1.0612e+06 2.3095e+05 2.02e+05 2.938e+05 14 3 36 1 23 2 2 1 2 2 2 2 4387 4.7359e+06 14 1.7768e+05 80 48 76 120 82 111 101 43 5.9425e+05 1.1569e+06 4.3651e+05 7.2828e+05 6.8529e+05 5 4 47 2 35 1 2 2 1 2 2 2 11924 3.9025e+06 14 97785 88 89 47 48 63 57 114 26 6.5167e+05 4.2273e+05 4.1209e+05 91529 3.7639e+05 12 3 50 1 33 2 2 1 1 2 1 2 10140 4.5704e+06 10 2.0983e+05 105 116 106 111 84 107 71 35 57911 8.6779e+05 7.5877e+05 3.1969e+05 1.5976e+05 13 2 44 1 31 1 1 1 1 2 1 2 3470 4.2125e+06 12 1.5034e+05 114 68 127 47 117 128 53 32 7.5107e+05 8.2558e+05 3.5592e+05 1.0095e+05 6.3417e+05 11 4 43 1 33 1 1 2 2 2 1 2 5420 4.282e+06 10 1.8699e+05 113 40 42 42 78 106 57 28 7.405e+05 1.1974e+06 1.8045e+05 5.2456e+05 2.9181e+05 15 2 54 1 33 2 1 1 2 1 2 1 6963 4.9724e+06 10 1.894e+05 125 48 118 59 70 105 110 25 8.5158e+05 6.9952e+05 6.784e+05 7.6577e+05 5.4642e+05 9 3 59 2 26 2 1 1 1 1 1 2 6249 4.3276e+06 12 1.381e+05 100 109 123 85 95 90 72 27 6.8607e+05 49524 61646 7.0453e+05 2.4327e+05 15 1 33 2 31 1 1 1 2 1 2 1 5094 4.6796e+06 15 1.233e+05 84 71 62 117 82 116 105 42 9.7329e+05 9.9875e+05 7.5606e+05 1.7506e+05 2.4159e+05 8 4 56 2 23 1 1 1 2 2 2 2 4797 4.0542e+06 14 1.7591e+05 70 58 100 112 80 76 101 43 5.0885e+05 1.3074e+05 43228 3.0062e+05 92124 13 2 41 1 33 2 1 1 1 2 2 2 5041 4.1432e+06 13 1.2087e+05 115 93 109 48 56 76 100 43 1.1898e+05 3.6624e+05 13412 5.7512e+05 3.316e+05 8 1 59 1 32 2 1 1 1 1 2 2 6901 4.4217e+06 12 2.1228e+05 84 41 85 97 44 89 125 32 2.5385e+05 2.5791e+05 3.8231e+05 6.7271e+05 3.8516e+05 5 2 47 1 27 2 2 1 1 1 1 1 7256 3.9396e+06 14 1.978e+05 55 84 124 59 113 44 67 44 5.9866e+05 5.7407e+05 3.5612e+05 78520 6.0707e+05 13 1 50 2 34 1 1 1 2 2 1 1 8219 4.0035e+06 14 2.177e+05 49 54 127 60 39 110 46 23 1.4965e+05 2.0861e+05 2.6658e+05 53715 4.8138e+05 11 2 39 2 30 1 2 1 1 1 1 1 4418 4.6514e+06 10 1.8401e+05 82 75 53 75 95 96 44 32 2.3267e+05 8.802e+05 1.7504e+05 1.8127e+05 1.8415e+05 14 3 48 1 33 1 1 2 1 1 1 2 6358 4.1492e+06 15 1.2098e+05 40 67 40 122 71 117 111 30 3.5056e+05 1.505e+05 5.8508e+05 6325 5.3733e+05 13 4 32 2 27 1 1 1 1 2 1 2 8669 4.6162e+06 11 1.3865e+05 75 46 56 120 84 91 56 23 5.0295e+05 5.9787e+05 3.3942e+05 1.1695e+05 7.9396e+05 13 1 33 2 24 1 1 1 2 1 1 2 9435 4.1163e+06 13 1.3034e+05 79 63 110 54 41 102 41 27 1.0106e+06 1.1522e+06 93316 6.3947e+05 63629 3 1 51 2 26 2 2 2 2 2 2 2 11144 4.8441e+06 10 1.651e+05 73 73 60 82 107 79 42 32 1.9234e+05 4.6814e+05 97485 2.6912e+05 1.6345e+05 8 4 50 2 23 2 1 2 1 1 2 2 5060 4.4344e+06 13 1.3825e+05 78 63 58 105 113 57 89 29 3.6543e+05 2.3943e+05 7.5551e+05 2.9027e+05 6.4382e+05 12 3 42 2 23 1 1 2 2 1 1 1 7766 3.962e+06 13 1.2563e+05 109 89 42 60 42 72 55 26 6.201e+05 5.4474e+05 7.4064e+05 4.6846e+05 2.0832e+05 12 2 48 2 33 2 1 2 2 2 1 2 10879 4.0978e+06 15 1.638e+05 104 126 107 74 103 97 114 38 1.0596e+05 2.1608e+05 2.9709e+05 7.5566e+05 7.8007e+05 9 1 45 1 31 1 2 2 2 1 2 2 11490 3.879e+06 15 1.7781e+05 121 65 71 44 100 128 110 27 4.2092e+05 1.1096e+06 6.557e+05 7.5341e+05 5.5461e+05 9 2 58 1 28 1 2 2 2 1 1 1 4082 4.7311e+06 13 2.1959e+05 98 86 53 118 128 103 119 26 1.7517e+05 68868 6.3267e+05 2.0772e+05 2.5893e+05 10 4 36 2 31 1 2 1 2 2 2 1 5078 4.6562e+06 15 1.9904e+05 89 98 44 59 124 41 45 28 1.1778e+06 7.1416e+05 93094 3.1418e+05 1.0276e+05 5 1 50 2 22 2 1 2 1 1 2 2 4580 4.489e+06 14 1.7049e+05 70 103 52 90 119 41 57 29 5.9024e+05 2.9061e+05 2.5659e+05 94684 5.4405e+05 14 4 40 2 23 1 1 1 2 2 1 2 7983 4.8722e+06 10 2.1756e+05 106 127 94 89 98 80 61 35 7.2781e+05 4.6575e+05 3.8292e+05 7.7546e+05 24476 4 2 37 1 27 2 2 1 1 2 2 2 5500 3.9777e+06 14 2.0349e+05 119 44 57 112 109 104 56 33 9.3676e+05 7.3754e+05 2.2739e+05 4.1016e+05 6.3676e+05 15 3 39 1 35 1 2 2 1 1 2 1 3956 4.3163e+06 15 1.1227e+05 107 127 98 116 91 114 76 41 6.3289e+05 4.502e+05 3.6842e+05 3.7663e+05 7.0399e+05 5 2 54 2 27 2 1 1 2 1 1 2 9532 4.5881e+06 15 1.6256e+05 57 78 116 114 40 56 95 25 6.7097e+05 1.1588e+06 7.0653e+05 2.5998e+05 1.9053e+05 12 1 43 2 23 2 2 1 2 2 1 1 3555 4.6022e+06 13 1.599e+05 92 61 66 80 83 80 106 36 3.2771e+05 9.2791e+05 5.5474e+05 3.7509e+05 1.5753e+05 4 3 61 1 25 1 1 2 2 2 2 2 4316 4.4792e+06 12 1.5153e+05 71 113 83 94 45 49 128 30 1.041e+06 99768 7.8782e+05 4.1362e+05 3.1702e+05 6 4 44 2 28 2 2 1 2 2 2 2 7045 4.622e+06 14 1.0125e+05 95 127 57 124 87 120 75 40 2.6582e+05 813 5.2e+05 4.8402e+05 4.5642e+05 7 4 59 1 25 1 2 1 1 1 1 2 7940 4.3303e+06 15 1.516e+05 39 101 91 113 95 110 44 32 9.2202e+05 1.1469e+06 21081 1.3626e+05 1.7217e+05 13 1 32 2 24 2 1 1 2 1 2 2 11994 4.1947e+06 15 1.7292e+05 79 62 104 58 94 115 89 22 1.6861e+05 1.3724e+05 5.5027e+05 3.0163e+05 2.3461e+05 8 4 32 1 26 1 2 2 2 2 1 2 8870 4.8186e+06 14 1.3569e+05 101 63 88 73 120 39 102 24 1.1145e+06 5.1834e+05 6.1156e+05 4.5609e+05 4.0275e+05 15 1 34 1 31 2 2 1 2 1 2 1 4250 4.4705e+06 11 2.2066e+05 83 55 127 55 46 110 121 45 1.7053e+05 1.0885e+06 72854 7.9502e+05 4.3373e+05 11 3 38 1 34 1 2 1 2 1 1 2 8702 4.3068e+06 14 2.232e+05 119 101 59 108 67 43 68 40 54743 3.9076e+05 1.9706e+05 3.8841e+05 3.3205e+05 8 4 61 1 26 1 2 2 1 2 2 1 5510 4.68e+06 15 1.1587e+05 108 63 100 51 96 86 101 39 5.7066e+05 6556 5.695e+05 2.366e+05 6.2716e+05 11 2 33 1 34 2 2 1 2 1 1 2 10654 4.3146e+06 11 1.1328e+05 42 101 85 75 64 117 125 25 6.022e+05 45436 1.7054e+05 1.9695e+05 6.4765e+05 8 2 56 1 30 2 1 1 1 2 1 2 9255 4.7529e+06 15 1.3269e+05 66 69 96 109 114 95 50 26 4.225e+05 1.0255e+06 5.3622e+05 4.5013e+05 4.8763e+05 11 1 56 1 26 1 2 2 2 1 2 2 5843 4.8526e+06 12 1.0358e+05 59 105 122 94 126 82 71 32 1.1691e+06 1.6196e+05 2.8309e+05 2.3769e+05 6.1868e+05 15 4 34 2 34 2 1 2 1 1 2 2 11611 4.2177e+06 10 1.2507e+05 60 94 63 93 70 89 43 30 1.1452e+06 8.2321e+05 45613 4.4707e+05 7.5443e+05 5 3 39 1 31 1 2 1 1 2 1 2 6227 4.5034e+06 10 1.132e+05 65 128 81 106 126 103 78 34 6.9414e+05 8.6913e+05 51949 2.618e+05 1.8862e+05 15 2 52 2 27 1 2 2 2 1 2 1 6798 3.8921e+06 12 1.5142e+05 109 67 59 93 115 128 91 25 1.1681e+06 6.9579e+05 4.0968e+05 6.3325e+05 96028 15 3 39 2 28 1 2 1 1 2 1 2 6622 3.954e+06 11 1.0631e+05 95 61 93 97 86 119 39 31 8.0619e+05 3.2302e+05 5.8428e+05 4.6342e+05 6.7062e+05 4 4 37 1 33 2 2 2 1 2 1 2 10339 4.9465e+06 14 1.6161e+05 83 108 104 110 61 69 70 28 6.8308e+05 2.3696e+05 7.0991e+05 5.5309e+05 5.633e+05 6 1 57 1 23 1 1 1 2 2 1 1 6038 4.058e+06 13 1.2777e+05 48 45 122 52 71 95 63 45 3.8451e+05 1.0123e+06 3.0109e+05 3.4506e+05 5.7412e+05 11 1 58 2 34 1 1 2 2 2 2 1 6028 3.9868e+06 12 2.1285e+05 118 96 123 39 70 71 63 42 7.6362e+05 5.9227e+05 32776 3.5979e+05 2.3785e+05 13 2 45 2 25 1 2 1 2 2 2 2 10393 3.8617e+06 13 1.561e+05 85 70 42 63 72 128 79 32 4.8811e+05 5.7915e+05 3.7296e+05 5.7336e+05 4.3716e+05 15 2 60 2 25 1 1 2 2 1 2 1 11110 4.1274e+06 10 99699 88 108 124 119 116 113 87 37 6.3287e+05 3.421e+05 5 5 5 3 4 43 1 26 2 2 2 2 1 2 2 3739 4.0081e+06 11 1.313e+05 113 51 74 115 88 102 93 31 8.506e+05 1.0112e+06 1.1333e+05 7.4276e+05 3.7876e+05 13 4 58 1 26 2 1 1 2 1 1 2 7255 4.137e+06 12 1.4287e+05 110 107 64 117 93 55 59 36 2.7242e+05 1.1865e+06 6.6025e+05 5.5564e+05 4.9857e+05 6 3 37 2 28 1 2 2 1 1 2 1 10303 4.8303e+06 13 1.9749e+05 68 81 62 93 61 99 43 26 4.5968e+05 1.1649e+06 5 5 5 9 4 40 1 31 1 2 2 1 1 1 1 7030 4.322e+06 10 2.1936e+05 104 70 61 64 118 99 62 41 8.7334e+05 9.4299e+05 7.9815e+05 7.7762e+05 3.9484e+05 11 2 44 1 31 2 2 1 1 2 1 2 6292 3.945e+06 11 1.9246e+05 51 47 69 46 77 79 42 27 3.3266e+05 6.4239e+05 2.7675e+05 1.2397e+05 1.4598e+05 15 1 36 2 34 2 1 2 1 2 2 1 11688 4.623e+06 13 1.0352e+05 52 76 107 112 123 84 120 27 7.47e+05 3.8502e+05 5 5 5 14 2 60 2 28 1 2 2 1 2 1 2 8839 4.3726e+06 12 2.1369e+05 105 128 69 64 50 77 64 23 80837 1.7681e+05 2.067e+05 1.6149e+05 1.2301e+05 12 1 46 2 35 1 2 2 1 1 1 1 3101 4.7531e+06 10 2.0641e+05 77 80 85 110 46 99 125 23 8.2155e+05 3.6444e+05 6.9843e+05 4.5276e+05 7.5819e+05 9 4 56 1 27 2 1 2 2 1 2 2 11489 4.5423e+06 11 1.8087e+05 97 96 75 86 58 104 111 31 6.8556e+05 4.4195e+05 5.7142e+05 2.2003e+05 7.3995e+05 14 4 43 2 23 1 1 1 1 1 2 1 5495 3.917e+06 13 1.6673e+05 109 99 53 46 53 92 119 30 95028 1.0561e+06 5 5 5 4 4 33 1 28 2 2 1 2 1 2 1 4151 4.3581e+06 13 1.2747e+05 110 86 97 112 108 87 128 41 8.0115e+05 3.4719e+05 5.7305e+05 82400 6.6756e+05 11 2 45 2 27 2 2 2 1 1 1 1 5463 4.864e+06 11 1.6473e+05 117 85 47 87 126 98 118 35 7.7133e+05 1.0244e+06 5 5 5 11 1 48 1 22 2 2 2 1 2 1 1 3782 4.9447e+06 11 2.2539e+05 70 100 56 45 97 126 101 33 3.2085e+05 1.3041e+05 1.9506e+05 1.178e+05 5.9851e+05 5 3 32 2 34 1 1 2 2 2 1 1 6004 4.7514e+06 13 1.8979e+05 47 114 51 82 106 93 114 31 61280 1.0363e+06 5 5 5 11 2 60 1 34 2 1 2 1 2 1 1 8189 4.3026e+06 15 2.2198e+05 106 68 124 127 101 121 39 24 47433 3.4106e+05 5 5 5 7 2 37 1 23 1 1 2 1 2 1 1 9179 4.1214e+06 14 2.202e+05 70 111 75 59 48 104 84 32 2.3284e+05 1.1435e+06 6.8463e+05 4.332e+05 2.9305e+05 15 2 61 2 32 1 1 1 2 1 2 1 10847 4.7663e+06 11 1.8663e+05 114 102 45 62 65 42 66 33 2.4288e+05 8.6783e+05 1.5275e+05 11872 60904 9 1 32 1 22 1 2 2 2 1 1 1 3153 4.1529e+06 11 2.0905e+05 69 92 42 101 107 78 112 30 5.2184e+05 96063 5 5 5 6 1 53 1 31 1 2 1 2 2 1 2 7947 4.232e+06 14 2.0219e+05 64 64 102 63 108 125 94 35 5.1068e+05 77648 5.0758e+05 4.3215e+05 8.0122e+05 6 3 60 2 35 2 2 2 2 1 1 2 12008 4.4611e+06 13 2.0546e+05 108 63 48 47 107 94 88 22 1.1194e+06 5.0542e+05 5.0953e+05 4.0541e+05 37122 14 1 56 2 28 1 2 1 1 1 2 2 3684 4.2615e+06 12 1.7455e+05 106 122 95 53 126 111 108 29 8.0124e+05 8.1616e+05 7.8447e+05 4.5272e+05 6.5642e+05 11 1 52 2 32 2 2 1 2 2 2 2 12027 4.6248e+06 13 97537 110 105 42 42 91 84 120 44 8.5482e+05 7.1576e+05 2.2808e+05 8.0474e+05 1.1887e+05 16 3 32 2 33 1 1 1 2 2 2 1 6587 4.2705e+06 14 1.6414e+05 107 39 39 83 103 52 103 45 1.0164e+06 2.2236e+05 5 5 5 15 4 59 2 34 2 1 1 1 1 2 1 11385 4.0356e+06 13 1.7191e+05 52 93 82 94 88 53 110 37 6.3856e+05 8.6483e+05 5 5 5 5 4 53 1 23 1 1 1 1 2 1 2 8258 3.9672e+06 13 1.5378e+05 109 68 122 101 97 50 118 22 5.1403e+05 6.9194e+05 6.0632e+05 6.9478e+05 7.1033e+05 15 1 57 2 33 1 1 2 2 2 1 2 4446 4.2479e+06 15 1.8708e+05 87 117 126 109 105 58 115 30 1.2008e+06 5.0868e+05 2.2304e+05 13136 2.2291e+05 12 1 57 1 28 1 2 2 1 2 2 1 6676 4.0574e+06 15 1.9255e+05 75 41 119 91 58 41 122 43 1.1383e+06 7.4242e+05 2206 3.0556e+05 7.0595e+05 7 4 33 1 22 1 1 1 2 1 1 1 3029 4.4736e+06 13 2.1772e+05 128 40 103 85 114 65 82 43 2.4359e+05 2.7308e+05 5.9221e+05 1.6906e+05 19406 7 3 39 1 30 1 2 2 1 1 2 2 11154 4.3411e+06 11 1.5561e+05 117 64 120 56 52 106 53 35 1.0737e+06 8.9172e+05 5 5 5 6 3 59 2 34 2 1 2 1 2 2 1 5328 4.1503e+06 10 1.8991e+05 92 106 111 61 75 127 48 30 9.1061e+05 4.0096e+05 6.4445e+05 4.0826e+05 7.189e+05 6 1 44 1 30 1 2 2 2 2 2 1 7864 4.7112e+06 13 1.6132e+05 40 61 88 64 120 106 49 38 3.9761e+05 1.11e+05 5 5 5 15 4 52 1 26 2 2 1 1 2 2 1 9743 4.035e+06 11 2.2298e+05 128 127 42 72 121 105 98 32 7.3395e+05 3.5132e+05 5 5 5 12 2 33 2 33 2 2 1 1 1 1 1 12094 5.018e+06 15 2.046e+05 123 127 44 78 72 125 73 42 8.7011e+05 5.5695e+05 5 5 5 11 3 49 1 25 1 2 1 2 2 2 2 7620 4.7873e+06 15 2.2232e+05 59 59 81 52 101 99 46 42 9.5866e+05 1.2532e+05 6.5882e+05 6.6202e+05 6.3334e+05 16 1 55 2 33 2 2 1 1 2 1 1 11101 4.1332e+06 10 2.1335e+05 82 115 74 78 44 116 124 40 1.8405e+05 2.3361e+05 5 5 5 8 4 43 1 35 2 1 2 2 2 1 2 9577 4.6134e+06 12 1.3704e+05 116 45 42 118 101 73 110 31 2.7724e+05 2.9084e+05 2.0099e+05 6.2885e+05 7.3481e+05 12 3 41 2 34 1 2 2 1 1 2 1 3177 4.7069e+06 14 1.1572e+05 97 122 108 103 93 111 91 41 1.1721e+06 6.5979e+05 2.4901e+05 71275 2.846e+05 14 4 57 2 29 2 1 2 1 2 2 1 3299 3.8504e+06 12 2.0457e+05 68 40 64 67 84 80 68 32 3.2526e+05 2.6951e+05 4.4772e+05 5.8144e+05 3.0829e+05 7 3 53 1 28 2 2 1 2 2 2 2 6513 4.6864e+06 14 1.5003e+05 78 126 64 80 125 95 59 43 1.0453e+06 8502 4.6291e+05 1.9345e+05 7.522e+05 8 1 39 2 23 2 2 1 1 1 2 2 5734 3.8717e+06 12 1.359e+05 114 41 41 124 106 73 74 40 7.0126e+05 9.4178e+05 5.5812e+05 2.8183e+05 4.937e+05 4 3 52 1 33 2 2 2 1 2 1 1 8538 4.0309e+06 10 2.1962e+05 50 68 84 119 75 48 76 22 6.6651e+05 2.873e+05 5.7137e+05 3.9792e+05 2.534e+05 13 4 34 2 33 1 2 2 1 1 1 2 4009 4.5113e+06 12 1.5425e+05 78 125 88 48 81 75 123 34 6.3117e+05 1.7421e+05 4.1781e+05 53563 3.9949e+05 15 3 45 2 31 2 1 1 1 1 1 1 8781 4.0651e+06 13 1.7694e+05 88 57 110 47 80 123 80 22 6.3763e+05 3.5245e+05 7.2816e+05 5.6314e+05 7.6684e+05 16 3 54 1 31 2 2 1 2 1 1 1 9446 4.5515e+06 10 1.9657e+05 83 119 125 54 122 126 62 25 1.1504e+06 4.8076e+05 7.9435e+05 4.9459e+05 6.5574e+05 16 1 39 2 30 2 1 1 1 2 2 2 3260 3.9666e+06 14 1.5681e+05 93 118 127 84 94 57 65 45 4.4213e+05 1.0431e+06 5.1035e+05 4.0744e+05 1.8772e+05 14 3 45 2 35 2 2 1 2 1 1 2 10704 3.9045e+06 11 1.41e+05 62 110 117 78 83 81 47 42 82420 7.9267e+05 2.0104e+05 6.095e+05 6.3407e+05 4 1 55 1 30 2 2 2 1 2 1 1 9369 4.2596e+06 14 1.1102e+05 115 61 39 115 40 43 68 36 1.1463e+06 1.0373e+05 6.5197e+05 44514 3.1283e+05 12 4 41 2 25 1 1 1 2 1 2 2 5628 4.685e+06 14 1.5366e+05 42 73 58 54 47 100 80 26 34963 4.8587e+05 5 5 5 11 1 52 2 26 1 2 2 1 2 1 2 5616 4.4127e+06 13 1.6661e+05 54 51 85 56 68 105 98 42 48740 3.2395e+05 7.9375e+05 1.4909e+05 6.9624e+05 15 3 56 1 25 2 2 2 1 1 1 1 3981 4.9635e+06 11 1.497e+05 93 87 118 118 92 95 128 29 4.6528e+05 1.0676e+06 3.1127e+05 5.7784e+05 24877 8 3 43 1 24 1 2 2 1 1 2 2 5162 4.6331e+06 10 2.1489e+05 85 117 60 123 122 73 59 28 5.335e+05 1.0591e+05 5 5 5 12 4 49 2 34 2 1 2 2 1 2 2 8846 4.3839e+06 14 1.0565e+05 127 115 104 59 42 73 50 22 3.2239e+05 3.5344e+05 5 5 5 6 2 55 1 28 2 1 1 1 2 1 1 5536 4.7604e+06 12 1.4804e+05 96 95 93 80 120 123 64 44 6.0918e+05 7.7703e+05 2.7812e+05 408 1.9466e+05 14 4 47 1 32 1 2 2 2 1 2 1 10592 4.6017e+06 14 1.2947e+05 41 58 120 58 41 114 104 32 55490 7.8029e+05 3.7529e+05 44343 3.6927e+05 3 3 61 1 27 2 2 2 2 2 2 2 7537 4.182e+06 12 1.2212e+05 60 71 75 90 71 117 114 32 5.4556e+05 5.4804e+05 5 5 5 12 2 33 2 22 1 2 2 2 1 2 2 3104 4.1636e+06 10 1.9498e+05 128 122 114 115 96 66 56 26 1.4483e+05 1.1125e+06 7.1682e+05 3.2588e+05 5.9883e+05 12 4 39 1 25 1 1 2 2 2 2 1 7102 4.3602e+06 14 97735 103 126 120 75 39 95 110 42 95488 6.1439e+05 5.861e+05 2.2038e+05 3.3502e+05 12 1 57 2 29 2 1 2 1 2 2 1 11022 4.0487e+06 12 1.5758e+05 43 85 61 43 112 95 72 44 3.8068e+05 8.4205e+05 5 5 5 15...
class(HCV)
ans = 'table'
size(HCV)
ans = 1×2
1385 29
sum(sum(ismissing(HCV)))
ans = 0
HCV_array=table2array(HCV);

Check for outliers

rmoutliers(HCV_array)
ans = 9×29
33 1 30 2 2 1 2 1 2 2 3178 4127797 12 180182 127 128 98 44 67 54 108 29 718359 664957 462214 456054 688986 15 4 35 1 26 2 2 1 2 1 2 2 8093 4144255 14 217810 95 75 73 93 43 90 127 38 1121894 95572 5 5 5 14 1 52 1 28 2 2 1 2 1 2 2 11065 4002769 15 148857 128 128 100 71 51 66 120 26 262262 595314 207550 87531 607145 9 3 56 1 24 2 2 1 2 1 2 2 5878 4564217 13 107505 62 107 104 121 73 120 73 32 1135021 126311 35471 725628 594410 9 1 45 1 35 2 2 1 2 1 2 2 4279 4796181 14 203108 99 90 118 71 105 64 84 23 76645 636923 18502 102445 243741 13 3 58 1 35 2 2 1 2 1 2 2 4972 4132934 15 118505 88 90 86 120 58 64 121 29 977436 1184754 554051 74441 112469 11 1 51 1 35 2 2 1 2 1 2 2 8795 4141988 12 163514 71 42 94 100 103 116 123 30 582174 331149 661693 161475 26065 6 4 36 1 24 2 2 1 2 1 2 2 9885 4622976 12 171369 122 59 53 128 55 101 81 27 667184 839870 791212 256983 35834 13 2 48 1 27 2 2 1 2 1 2 2 10098 4252898 14 223813 51 57 96 46 65 63 119 41 506343 808770 608544 319702 336238 5 3
% Without considering categorical data
[cdata,tf]=rmoutliers(HCV_array(:,[1,3,11:end]))
cdata = 1381×21
46 29 12101 4429425 10 129367 91 123 95 75 113 57 123 44 40620 538635 637056 336804 31085 4 2 49 33 6490 4794631 10 146457 43 64 109 80 88 48 77 33 1041941 449939 585688 744463 582301 10 3 58 22 11785 3882456 15 131228 66 104 121 96 65 73 114 29 1157452 1086852 5 5 5 4 4 42 26 11620 4747333 12 177261 78 57 113 118 107 84 80 28 325694 1034008 275095 214566 635157 12 4 48 30 7335 4405941 11 216176 119 112 80 127 45 96 53 39 641129 72050 787295 370605 506296 12 3 44 23 10480 4608464 12 148889 93 83 55 102 97 122 39 45 591441 757361 5 371090 203042 5 2 45 30 6681 4455329 12 98200 55 68 72 127 81 125 43 30 1151206 230488 267320 275295 555516 4 2 37 24 4437 4265042 12 166027 103 124 111 74 53 123 101 33 1023123 103190 731929 448466 59998 15 2 36 22 6052 4130219 13 144266 75 49 93 52 46 46 59 45 137712 1122999 561438 63145 806204 16 1 45 25 9279 4116937 13 203003 97 101 66 53 95 55 104 26 936444 536969 5 5 5 8 1
tf = 1385×1 logical array
1 0 1 0 1 0 0 0 0 0
for i=1:1385
if tf(i)==1
HCV_array(i,:)=[];
end
end
cdatat=array2table(cdata,'VariableNames',HCV.Properties.VariableNames([1,3,11:end]))
cdatat = 1381×21 table
 AgeBMIWBCRBCHGBPlatAST1ALT1ALT4ALT12ALT24ALT36ALT48ALTafter24wRNABaseRNA4RNA12RNAEOTRNAEFBaselinehistologicalGradingBaselinehistologicalstaging
14629121014429425101293679112395751135712344406205386356370563368043108542
249336490479463110146457436410980884877331041941449939585688744463582301103
358221178538824561513122866104121966573114291157452108685255544
442261162047473331217726178571131181078480283256941034008275095214566635157124
548307335440594111216176119112801274596533964112972050787295370605506296123
6442310480460846412148889938355102971223945591441757361537109020304252
74530668144553291298200556872127811254330115120623048826732027529555551642
837244437426504212166027103124111745312310133102312310319073192944846659998152
9362260524130219131442667549935246465945137712112299956143863145806204161
104525927941169371320300397101665395551042693644453696955581
113422563843216031414111012061645178901132339297688432258683418277578215492
1240321150741656031422287412712210610588111111361133727111187142130443754412460982
135834803548964641114950611753508012066863461495131429683690671490135145151
146135108434165219101976408610570868387473390009972146055534
1555248476446688514163276531015095112976827114531023099345788231836325641543
16562765994448466151906425312462765746932650675635918174339940517516298364
173523484544360251011181911512163127951249342108049976404717159404314477719164
18572359254031637151165588610911811955103843216962478601766907653118728252462
193325995249947291010902384776781117684232113520057274755541
20412379614595487149473345921031044011593332933804405765309818292187341103
213929713646252481021136370102765811195582599394099265296482334897762760154
223324605743007741122213562911161284170106432434339813701250436001575338363
23433466484529290151098714811299855987783595529654065455594
24513411032405258315945034154128647189873476635553126973760373486337283751
25393352344906158121903146112011375881149943486467459904557873329219572152
26572660384763261131267215111898429353834528537418665755593
2747295846475353115104729120721171264595493842613624777776701537712331515091
2855335383399938815182262964959886258814111943019286792977812425024404971
29583573783998925102011145711012896691057226557708287714623587668913504452
30472574864599496111673549464541226464962460406341631332335271665567854881
31613311770458109913125642424782102487653341159877318505463260381345282914101
32372764414075477111187424211867111481071014527260091626404523674101242552113
334129103044152639141208121281027963808612740116516651508367178588014746328163
3460327365402321514222471521266712612641543511240148911246164133600628726152
355429107044911615101717254043466410145914347190581000789780262941118971151
36402830094354206149560469586250608411431961292711462824131034141792
37323199563939529141964337881904868831283985509910252040730622000640549774
3858336627388964914182897106691279947103111331047535320353349454546832643942131
3937231039343086381118411393564012410150903327134920632915121730772917452361
40582310236479792314101512127929411396126108392725071061189230947201997293804143
41362343874735873141776778048761208211110143594248115685943651272827968528654
42473511924390248814977858889474863571142665167142272941208691529376394123
4350331014045704171020982710511610611184107713557911867787758773319688159764132
444431347042124991215034411468127471171285332751073825583355919100948634168114
45433354204281958101869921134042427810657287405021197447180453524563291808152
46543369634972412101894001254811859701051102585158269952167839676577354642393
475926624943276271213809910010912385959072276860654952461646704530243265151
4833315094467959115123295847162117821161054297329499874575606017506224159384
495623479740541541417590670581001128076101435088491307394322830061692124132
504133504141431651312087111593109485676100431189843662381341257511633159981
51593269014421722122122798441859744891253225384825791338230667271138516152
524727725639396061419779855841245911344674459866157407035612278520607065131
535034821940034771421769949541276039110462314965020861126657853715481377112
54393044184651439101840138275537595964432232668880198175039181268184145143
554833635841491531512097540674012271117111303505641504965850776325537325134
563227866946162301113864875465612084915623502947597869339418116953793961131
573324943541163041313034279631105441102412710106491152238933166394656362931
58512611144484405310165099737360821077942321923434681409748526911516344784
5950235060443437213138252786358105113578929365429239426755513290266643822123
604223776639619791312562510989426042725526620102544738740639468459208323122
6148331087940977921516379510412610774103971143810595821607629708675565678006691
6245311149038789971517780912165714410012811027420923110960565570375341155461292
63582840824731084132195909886531181281031192617516968868632671207722258934104
64363150784656201151990428998445912441452811778157141609309431417710275851
655022458044890311417048970103529011941572959024429061025658894684544054144
66402379834872198102175631061279489988061357278114657533829227754582447642
673727550039777141420348911944571121091045633936761737542227389410164636763153
68393539564316260151122731071279811691114764163289345019636841637662770398952
69542795324588105151625645778116114405695256709691158751706532259979190526121
70432335554602228131598969261668083801063632770692790655474037508715752843
716125431644792471215152671113839445491283010410419976878781541361631702164
7244287045462203514101254951275712487120754026582281352000448401745641974
73592579404330272151515973910191113951104432922015114691421081136255172170131
7432241199441946961517291579621045894115892216861313724055026830162623461384
753226887048185721413568910163887312039102241114527518344611557456092402753151
76343142504470537112206618355127554611012145170527108848672854795019433726113
773834870243068481422319611910159108674368405474339076019706038841033204584
786126551046800491511586710863100519686101395706626556569496236595627163112
7933341065443146171111328142101857564117125256022004543617054019695164764582
80563092554752903151326946669961091149550264225041025482536220450133487629111
815626584348526361210357559105122941268271321169124161960283093237688618676154
82343411611421773010125068609463937089433011451568232074561344706775443053
83393162274503431101131996512881106126103783469414286913051949261800188617152
84522767983892146121514191096759931151289125116809469579140968063324796028153
85392866223954021111063099561939786119393180618632301958427746341767062244
86373310339494647914161613831081041106169702868308223696070990955308556330261
8757236038405799913127765484512252719563453845121012307301092345055574121111
885834602839868141221285211896123397071634276362459227232776359787237850132
8945251039338616751315609585704263721287932488111579154372958573360437164152
906025111104127397109969988108124119116113873763286834209655534
914326373940081451113129511351741158810293318506031011234113328742762378760134
92582672554136958121428691101076411793555936272423118650366024555564249856763
933728103034830312131974866881629361994326459679116487155594
9440317030432200810219360104706164118996241873343942992798150777618394841112
95443162923944957111924555147694677794227332660642393276746123973145977151
9636341168846229501310352152761071121238412027746998385015555142
976028883943726381221368710512869645077642380837176811206701161492123008121
984635310147531251020640977808511046991252382155236443569842545275775818694
99562711489454234411180872979675865810411131685564441952571415220033739954144
1004323549539169991316673410999534653921193095028105607855544
HCV_array
HCV_array = 1381×29
46 1 29 1 2 2 1 2 2 1 12101 4429425 10 129367 91 123 95 75 113 57 123 44 40620 538635 637056 336804 31085 4 2 57 1 33 2 2 2 2 1 1 1 4178 4621191 12 151522 113 49 95 107 116 5 5 5 571148 661346 5 735945 558829 4 4 59 1 32 1 1 2 1 2 2 2 3661 4606375 11 187684 99 104 67 48 120 94 90 30 660410 738756 3731527 338946 242861 11 1 58 2 22 2 2 2 1 2 2 1 11785 3882456 15 131228 66 104 121 96 65 73 114 29 1157452 1086852 5 5 5 4 4 48 2 30 1 1 2 2 1 1 2 7335 4405941 11 216176 119 112 80 127 45 96 53 39 641129 72050 787295 370605 506296 12 3 44 1 23 1 1 2 2 2 1 2 10480 4608464 12 148889 93 83 55 102 97 122 39 45 591441 757361 5 371090 203042 5 2 45 1 30 2 1 2 2 1 1 2 6681 4455329 12 98200 55 68 72 127 81 125 43 30 1151206 230488 267320 275295 555516 4 2 37 2 24 2 1 2 1 2 2 1 4437 4265042 12 166027 103 124 111 74 53 123 101 33 1023123 103190 731929 448466 59998 15 2 36 1 22 2 2 1 1 1 1 1 6052 4130219 13 144266 75 49 93 52 46 46 59 45 137712 1122999 561438 63145 806204 16 1 45 2 25 2 1 1 1 2 1 2 9279 4116937 13 203003 97 101 66 53 95 55 104 26 936444 536969 5 5 5 8 1
% Cross-verify with histograms and boxplots
for z=1:19
figure
histogram2(cdata(:,z),cdata(:,end))
end
for z=1:19
figure
histogram(cdata(:,z),'Normalization',"pdf")
ylabel(cdatat.Properties.VariableNames(z))
end
for z=1:19
figure
histfit(cdata(:,z))
ylabel(cdatat.Properties.VariableNames(z))
end
for z=1:19
figure
boxplot(cdata(:,z),cdata(:,end))
xlabel('Stage of liver fibrosis')
ylabel(HCV.Properties.VariableNames(z))
end

Basic Statistical Values

% Stage and Continuous variables
sv=zeros(21,7);
for i=1:21
sv(i,1)=min(cdatat.(i));
sv(i,2)=max(cdatat.(i));
sv(i,3)=range(cdatat.(i));
sv(i,4)=mean(cdatat.(i));
sv(i,5)=median(cdatat.(i));
sv(i,6)=std(cdatat.(i));
sv(i,7)=corr(cdatat.(i),cdatat.(21));
end
array2table(sv,'RowNames',cdatat.Properties.VariableNames,"VariableNames",{'Minimum','Maximum','Range','Mean','Median','Std.Dev.','Correlation'})
ans = 21×7 table
 MinimumMaximumRangeMeanMedianStd.Dev.Correlation
1 Age32612946.3049468.7717-0.0176
2 BMI22351328.5945294.0731-0.0584
3 WBC29911210191107.5388e+0375142.6685e+030.0178
4 RBC3816422501845112020294.4223e+0644384653.4657e+050.0106
5 HGB1015512.5902131.71350.0039
6 Plat930132264641334511.5835e+051579163.8817e+04-0.0174
7 AST1391288982.75818325.9888-0.0240
8 ALT1391288983.92408325.93750.0389
9 ALT4391288983.43598326.5488-0.0163
10 ALT12391288983.49678426.0676-0.0013
11 ALT24391288983.65248326.2091-0.0052
12 ALT36391288983.27958426.1830-0.0015
13 ALT48391288983.79588426.0037-0.0088
14 ALTafter24w22452333.5025346.95770.0404
% Stages and Categorical variables
cadat=HCV_array(:,[2,4:10,29]);
cs=zeros(16,5);
for j=1:8
cs(2*j-1,1)=sum(cadat(:,j)==1 & cadat(:,9)==1); % Stage 1
cs(2*j-1,2)=sum(cadat(:,j)==1 & cadat(:,9)==2); % Stage 2
cs(2*j-1,3)=sum(cadat(:,j)==1 & cadat(:,9)==3); % Stage 3
cs(2*j-1,4)=sum(cadat(:,j)==1 & cadat(:,9)==4); % Stage 4
cs(2*j-1,5)=sum(cadat(:,j)==1); % Total
cs(2*j,1)=sum(cadat(:,j)==2 & cadat(:,9)==1); % Stage 1
cs(2*j,2)=sum(cadat(:,j)==2 & cadat(:,9)==2); % Stage 2
cs(2*j,3)=sum(cadat(:,j)==2 & cadat(:,9)==3); % Stage 3
cs(2*j,4)=sum(cadat(:,j)==2 & cadat(:,9)==4); % Stage 4
cs(2*j,5)=sum(cadat(:,j)==2); % Total
end
array2table(cs,'RowNames',{'Male','Female','Fever Absent','Fever Present','NausaeVomiting Absent',...
'NausaeVomiting Present','Headache Absent','Headache Present','Diarrhea Absent','Diarrhea Present',...
'Fatigue Absent','Fatigue Present','Jaundice Absent', 'Jaundice Present','GastricPain Absent', ...
'Gastric Pain Present'},'VariableNames',{'Stage 1','Stage 2','Stage 3','Stage 4','Total'})
ans = 16×5 table
 Stage 1Stage 2Stage 3Stage 4Total
1 Male172181162190705
2 Female164149192171676
3 Fever Absent157159166186668
4 Fever Present179171188175713
5 NausaeVomiting Absent180171165170686
6 NausaeVomiting Present156159189191695
7 Headache Absent167168180181696
8 Headache Present169162174180685
9 Diarrhea Absent160171178178687
10 Diarrhea Present176159176183694
11 Fatigue Absent175158183176692
12 Fatigue Present161172171185689
13 Jaundice Absent179154182176691
14 Jaundice Present157176172185690
%Data normalization
HCV_arr=normalize(HCV_array,'range');
dat=array2table(HCV_arr,'VariableNames',HCV.Properties.VariableNames)
dat = 1381×29 table
 AgeGenderBMIFeverNauseaVomtingHeadacheDiarrheaFatiguegeneralizedboneacheJaundiceEpigastricpainWBCRBCHGBPlatAST1ALT1ALT4ALT12ALT24ALT36ALT48ALTafter24wRNABaseRNA4RNA12RNAEOTRNAEFBaselinehistologicalGradingBaselinehistologicalstaging
10.482800.538501101101.00000.510000.27240.58430.94380.62920.40450.83150.42280.95930.97500.03380.44820.17070.41660.03840.07690.3333
20.862100.846211110000.13030.66950.40000.43840.83150.11240.62920.76400.86520000.47550.550300.91030.68960.07691.0000
30.931000.769200101110.07350.65720.20000.70940.67420.73030.31460.10110.91010.72360.69110.62500.54980.61471.00000.41930.29970.61540
40.89661011101100.96530.05491.00000.28640.30340.73030.92130.64040.29210.55280.88620.60000.96370.90440000.07691.0000
50.551710.615400110010.47680.49040.20000.92290.89890.82020.46070.98880.06740.73980.39020.85000.53380.06000.21100.45840.62480.69230.6667
60.413800.076900111010.82210.65890.40000.41870.60670.49440.17980.70790.65170.95120.27641.00000.49240.630200.45900.25060.15380.3333
70.448300.615410110010.40500.53150.40000.03890.17980.32580.37080.98880.47190.97560.30890.62500.95850.19180.07160.34050.68550.07690.3333
80.172410.153810101100.15870.37320.40000.54710.71910.95510.80900.39330.15730.95930.78050.70000.85180.08590.19610.55470.07400.92310.3333
90.13790011000000.33600.26110.60000.38410.40450.11240.60670.14610.07870.33330.43901.00000.11460.93450.15050.07810.99491.00000
100.448310.230810001010.69020.25000.60000.82420.65170.69660.30340.15730.62920.40650.80490.52500.77970.44680000.38460
110.06900001001100.29060.42030.80000.36040.91010.24720.28090.13480.43820.69110.87800.45000.32720.73590.15730.22610.96520.46150.3333
120.275910.769211101000.93480.29050.80000.97310.98880.93260.75280.74160.55060.86180.86180.77500.94390.92520.11290.54120.15380.38460.3333
130.896600.923110001000.55370.89850.20000.42330.87640.15730.12360.46070.91010.49590.65850.72500.51200.26150.02240.83060.16680.92310
141.000001.000001110010.86190.290200.78400.52810.74160.34830.52810.49440.66670.34150.70000.74940.600400001.0000
150.793110.153810111110.60210.54110.80000.52650.15730.69660.12360.62920.82020.74800.51220.55000.95360.19220.12270.39380.31640.07690.6667
160.827600.384601111110.39600.52581.00000.73160.15730.95510.25840.41570.20220.33330.71540.52500.42190.29890.19920.50120.20110.23081.0000
170.103410.076911000010.20350.515500.14090.85390.92130.26970.98880.62920.96750.71540.92500.89960.06360.19220.50010.58951.00001.0000
180.862110.076900110000.32210.17901.00000.17640.52810.78650.88760.89890.17980.79670.64230.67500.14120.65410.17930.65700.34860.23080.3333
190.034500.230810111110.76410.980300.12000.50560.42700.31460.47190.87640.51220.30080.67500.94510.47660000.07690
200.310300.076901111010.54560.64810.80000.01290.06740.59550.71910.73030.01120.89430.71540.70000.24430.36660.01420.02260.23120.53850.6667
210.241410.538501010010.45500.672900.88680.34830.70790.41570.21350.80900.73170.43090.50000.82750.82600.02590.41420.94130.92311.0000
220.034510.153801111010.33660.40290.20000.96760.25840.58430.86521.00000.02250.52850.82110.95000.20270.81660.00330.44530.92970.23080.6667
230.379310.923111100000.40140.59311.00000.12630.10110.82020.67420.51690.22470.66670.59350.75000.79540.44990000.46151.0000
240.655200.923110110010.88270.19651.00000.01120.02250.16851.00000.28090.35960.68290.66670.72500.63800.44210.19770.90900.46010.15380
250.241410.846210100010.24620.90660.40000.72910.24720.91010.83150.40450.55060.88620.76420.95000.40500.03830.01220.90700.02410.92310.3333
260.862110.307701100010.33450.78770.60000.25260.13480.88760.66290.03370.60670.39020.63411.00000.23760.15530000.46150.6667
270.517210.538500101010.31340.77961.00000.08780.91010.37080.87640.97750.06740.73170.35770.82500.35480.20620.20550.46650.38890.46150
280.793110.846201101010.26260.15221.00000.66880.64040.11240.22470.55060.25840.43090.61790.90000.99440.77280.00800.15370.30120.30770
290.896611.000011110000.48160.151800.81000.20220.79781.00000.64040.33710.81300.54470.52500.46430.23940.16710.08270.04320.15380.3333
300.517210.230810111010.49340.65150.20000.55710.61800.28090.16850.93260.28090.47970.73980.47500.50290.34640.08670.88650.83740.38460
311.000000.846201110000.96370.63620.60000.24450.03370.08990.48310.70790.10110.57720.39020.72500.96570.26500.12410.47170.34910.53850
320.172400.384611011110.37870.21550.20000.19280.03370.88760.31460.80900.10110.82930.78051.00000.22700.07620.10840.83380.29930.61540.6667
330.310300.538501001010.80270.27970.80000.20831.00000.70790.44940.26970.46070.65850.99190.87500.97010.04290.09840.72730.92101.00000.6667
340.965510.769211011100.48010.17200.80000.97010.14610.97750.31460.97750.97750.29270.39840.75000.09360.40700.12370.41560.35450.15380.3333
350.758600.538500011000.84670.911100.58980.01120.04490.07870.28090.69660.32520.69920.95000.03930.48350.21160.32520.14680.92310
360.275910.461510100000.00200.44740.80000.01940.33710.21350.25840.12360.23600.64230.88620.65000.80040.05920.00760.03840.00170.46150.3333
37000.692301000010.76450.10240.80000.77500.43820.47190.57300.10110.32580.63411.00000.85000.71190.08530.10920.27210.50040.30771.0000
380.896610.846201110110.39910.06090.80000.67350.75280.33710.98880.67420.08990.79670.86180.70000.87220.26660.09360.67640.79470.76920
390.172410.076911001100.81250.40950.20000.68260.60670.19100.01120.95510.69660.36590.69110.70000.22590.17170.04050.38060.21540.23080
400.896600.076900010010.79530.81650.80000.06370.98880.59550.61800.83150.64040.98370.83740.85000.22690.88310.06190.24990.36260.84620.6667
410.137900.076911011110.15320.76490.80000.63440.46070.10110.41570.91010.48310.86180.78050.95000.49480.96270.11700.90080.84570.15381.0000
420.517211.000001101110.98060.07160.80000.03580.55060.56180.08990.10110.26970.42280.88620.52500.54260.35180.11040.11320.46450.69230.6667
430.620700.846211001010.78470.627300.87530.74160.86520.75280.80900.50560.82930.53660.75000.04820.72210.20330.39540.19720.76920.3333
440.413800.692300001010.05260.32950.40000.42960.84270.32580.98880.08990.87641.00000.39020.67500.62530.68700.09540.12490.78260.61541.0000
450.379300.846200111010.26660.387300.70420.83150.01120.03370.03370.43820.82110.42280.57500.61650.99640.04840.64880.36010.92310.3333
460.758600.846210010100.43600.961700.72230.96630.10110.88760.22470.34830.81300.85370.50000.70900.58210.18180.94720.67430.46150.6667
470.931010.307710000010.35760.42530.40000.33780.68540.78650.94380.51690.62920.69110.54470.55000.57120.04120.01650.87150.30020.92310
480.034510.692300010100.23080.71811.00000.22690.50560.35960.25840.87640.48310.90240.81300.92500.81030.83110.20260.21650.29810.38461.0000
490.827610.076900011110.19820.19780.80000.62110.34830.21350.68540.82020.46070.57720.78050.95000.42370.10880.01160.37180.11370.76920.3333
500.310300.846210001110.22500.27180.60000.20880.85390.60670.78650.10110.19100.57720.77240.95000.09910.30480.00360.71140.40920.38460
510.931000.769210000110.42920.50360.40000.89370.50560.02250.51690.65170.05620.68290.97560.67500.21130.21460.10250.83210.47530.15380.3333
520.517200.384611000000.46820.10250.80000.78520.17980.50560.95510.22470.83150.31710.50410.97500.49840.47770.09540.09710.74920.76920
530.620710.923100011000.57390.15560.80000.93430.11240.16850.98880.236000.85370.33330.45000.12460.17360.07140.06640.59400.61540.3333
540.241410.615401000000.15660.694700.68190.48310.40450.15730.40450.62920.73980.31710.67500.19370.73250.04690.22420.22720.84620.6667
550.551700.846200100010.36960.27681.00000.20950.01120.31460.01120.93260.35960.91060.86180.62500.29190.12520.15680.00780.66310.76921.0000
56010.384600001010.62330.66540.20000.34200.40450.07870.19100.91010.50560.69920.41460.45000.41870.49750.09100.14470.97980.76920
570.034510.153800010010.70740.24950.60000.27970.44940.26970.79780.16850.02250.78860.29270.55000.84140.95880.02500.79100.078500
580.655210.307711111110.89500.854900.54020.38200.38200.23600.48310.76400.60160.30080.67500.16010.38960.02610.33290.20170.38461.0000
590.620710.076910100110.22710.51410.60000.33900.43820.26970.21350.74160.83150.42280.68290.60000.30420.19920.20250.35900.79450.69230.6667
600.344810.076900110000.52410.12110.60000.24440.78650.56180.03370.23600.03370.54470.40650.52500.51630.45330.19850.57950.25710.69230.3333
610.551710.846210111010.86590.23411.00000.53040.73030.97750.76400.39330.71910.74800.88620.82500.08820.17980.07960.93470.96260.46150
620.448300.692301110110.93290.05211.00000.63540.92130.29210.35960.05620.68541.00000.85370.55000.35040.92340.17570.93190.68440.46150.3333
630.896600.461501110000.11980.76090.60000.94850.66290.52810.15730.88761.00000.79670.92680.52500.14580.05730.16950.25690.31950.53851.0000
640.137910.692301011100.22910.69861.00000.79450.56180.66290.05620.22470.95510.29270.32520.57500.98060.59430.02490.38860.12680.15380
650.62071010100110.17440.55960.80000.58060.34830.71910.14610.57300.89890.29270.42280.60000.49140.24180.06880.11710.67140.84621.0000
660.275910.076900011010.54800.878300.93330.75280.98880.61800.56180.66290.60980.45530.75000.60600.38760.10260.95920.03020.07690.3333
670.172400.384611001110.27540.13420.80000.82780.89890.05620.20220.82020.78650.80490.41460.70000.77990.61370.06090.50730.78580.92310.6667
680.241401.000001100100.10590.41581.00000.14430.76400.98880.66290.86520.58430.88620.57720.90000.52690.37460.09870.46590.86880.15380.3333
690.758610.384610010010.71800.64201.00000.52120.20220.43820.86520.84270.01120.41460.73170.50000.55860.96420.18930.32160.23510.69230
700.379310.076911011000.06190.65370.60000.50120.59550.24720.30340.46070.49440.60980.82110.77500.27280.77220.14870.46400.19440.07690.6667
711.000000.230800111110.14540.55140.40000.43850.35960.83150.49440.61800.06740.35771.00000.62500.86670.08300.21110.51160.39120.23081.0000
720.413810.461511011110.44500.67020.80000.06180.62920.98880.20220.95510.53930.93500.56910.87500.22130.00070.13940.59870.56320.30771.0000
730.931000.230801000010.54320.42751.00000.439000.69660.58430.83150.62920.85370.31710.67500.76760.95440.00560.16850.21250.76920
74010.153810010110.98830.31471.00000.59870.44940.25840.73030.21350.61800.89430.68290.42500.14040.11420.14750.37310.28950.38461.0000
75000.307701111010.64530.83370.80000.31980.69660.26970.55060.38200.91010.27640.78860.47500.92790.43130.16390.56420.49700.92310
760.069000.692311010100.13820.54420.20000.95650.49440.17980.98880.17980.07870.85370.94311.00000.14200.90580.01950.98340.53520.61540.6667
770.206900.923101010010.62690.40800.80000.97550.89890.69660.22470.77530.31460.30890.51220.87500.04560.32520.05280.48040.40980.38461.0000
781.000000.307701101100.27650.71851.00000.17130.77530.26970.68540.13480.64040.65850.78050.85000.47510.00550.15260.29260.77400.61540.3333
790.034500.923111010010.84120.41450.20000.15190.03370.69660.51690.40450.28090.91060.97560.50000.50140.03780.04570.24360.79920.38460.3333
800.827600.615410001010.68760.77911.00000.29730.30340.33710.64040.78650.84270.73170.36590.52500.35180.85330.14370.55680.60180.61540
810.827600.307701110110.31310.86210.40000.07910.22470.74160.93260.61800.97750.62600.53660.67500.97340.13480.07590.29400.76350.92311.0000
820.069010.923110100110.94620.333900.24020.23600.61800.26970.60670.34830.68290.30890.62500.95340.68500.01220.55300.93100.15380.6667
830.241400.692301001010.35520.571500.15130.29211.00000.47190.75280.97750.79670.59350.72500.57790.72320.01390.32380.23280.92310.3333
840.689710.384601110100.41790.06300.40000.43770.78650.31460.22470.60670.85391.00000.69920.50000.97250.57900.10980.78330.11850.92310.6667
850.241410.461501001010.39860.11450.20000.09960.62920.24720.60670.65170.52810.92680.27640.65000.67120.26880.15660.57320.82760.07691.0000
860.172400.846211101010.80660.94010.80000.51400.49440.77530.73030.79780.24720.52030.52850.57500.56870.19720.19020.68410.69510.23080
870.862100.076900011000.33450.20100.60000.26040.10110.06740.93260.14610.35960.73170.47151.00000.32010.84240.08070.42680.70850.61540
880.896610.923100111100.33340.14180.40000.89800.88760.64040.943800.34830.53660.47150.92500.63580.49290.00880.44500.29350.76920.3333
890.448310.230801011110.81250.03760.60000.47270.51690.34830.03370.26970.37081.00000.60160.67500.40640.48190.09990.70920.53950.92310.3333
900.965510.230800110100.89120.258700.05010.55060.77530.95510.89890.86520.87800.66670.80000.52690.284700001.0000
910.379300.307711110110.08210.15950.20000.28690.83150.13480.39330.85390.55060.78860.71540.65000.70820.84150.03040.91870.46740.76921.0000
920.896600.307710010010.46810.26670.40000.37360.79780.76400.28090.87640.60670.40650.43900.77500.22680.98730.17690.68730.61530.23080.6667
930.172410.461501100100.80260.84350.60000.78290.32580.47190.25840.60670.24720.76420.30890.52500.38270.96930000.46151.0000
940.275900.692301100000.44340.420600.94680.73030.34830.24720.28090.88760.76420.46340.90000.72710.78470.21390.96190.48730.61540.3333
950.413800.692311001010.36230.10690.20000.74520.13480.08990.33710.07870.42700.60160.30080.55000.27700.53460.07420.15330.18010.92310
960.137910.923110101100.95470.67100.60000.07870.14610.41570.76400.82020.94380.64230.93500.55000.62190.32040000.84620.3333
970.965510.461501101010.64190.46270.40000.90430.74161.00000.33710.28090.12360.58540.47970.45000.06730.14710.05540.19980.15180.69230
980.482811.000001100000.01210.779300.84970.42700.46070.51690.79780.07870.76420.97560.45000.68400.30330.18720.56000.93560.46151.0000
990.827600.384610110110.93280.60390.20000.65840.65170.64040.40450.52810.21350.80490.86180.65000.57080.36780.15310.27220.91310.84621.0000
1000.379310.076900000100.27490.08370.60000.55240.78650.67420.15730.07870.15730.70730.92680.62500.07910.87880000.07691.0000

Correlation

array2table(corr(HCV_arr),'Rownames',dat.Properties.VariableNames,'VariableNames',dat.Properties.VariableNames)
ans = 29×29 table
 AgeGenderBMIFeverNauseaVomtingHeadacheDiarrheaFatiguegeneralizedboneacheJaundiceEpigastricpainWBCRBCHGBPlatAST1ALT1ALT4ALT12ALT24ALT36ALT48ALTafter24wRNABaseRNA4RNA12RNAEOTRNAEFBaselinehistologicalGradingBaselinehistologicalstaging
1 Age1.0000-0.0083-0.0274-0.0213-0.02410.01820.0454-0.00730.0088-0.01190.0138-0.0003-0.0125-0.0024-0.01570.00590.03270.01810.0044-0.00600.03000.00810.0218-0.01280.0004-0.0477-0.0286-0.0413-0.0185
2 Gender-0.00831.00000.00750.0231-0.0383-0.02410.01530.04560.0007-0.02380.0276-0.00400.00000.0196-0.01250.0235-0.01220.0105-0.01800.0049-0.02220.0070-0.0134-0.0234-0.03130.0329-0.01850.01470.0100
3 BMI-0.02740.00751.0000-0.01820.0060-0.0058-0.0251-0.0070-0.07330.00810.0375-0.00440.0585-0.00170.00260.03430.0029-0.06050.0088-0.0257-0.0044-0.0169-0.01810.0376-0.0103-0.0215-0.0439-0.0238-0.0565
4 Fever-0.02130.0231-0.01821.0000-0.00530.0213-0.0183-0.0050-0.0036-0.0169-0.0332-0.0292-0.0201-0.0032-0.01160.0216-0.0159-0.0219-0.0487-0.00690.00310.0015-0.00200.00170.02170.0182-0.0105-0.0317-0.0298
5 NauseaVomting-0.0241-0.03830.0060-0.00531.00000.02110.0079-0.03110.0022-0.0486-0.02210.01630.01420.08760.00960.0557-0.0228-0.02460.03820.0227-0.04370.00740.0069-0.0279-0.01120.00250.0198-0.05590.0550
6 Headache0.0182-0.0241-0.00580.02130.02111.00000.0196-0.01670.00800.0268-0.04300.0111-0.0290-0.02480.00550.05110.03800.02630.01090.0145-0.0059-0.00520.0242-0.04330.0116-0.0072-0.0079-0.0111-0.0027
7 Diarrhea0.04540.0153-0.0251-0.01830.00790.01961.0000-0.0268-0.01670.02240.0004-0.0167-0.06050.0286-0.03620.06020.0224-0.0649-0.02340.0072-0.0207-0.00160.0184-0.0001-0.01220.01890.03940.0321-0.0076
8 Fatiguegeneralizedboneache-0.00730.0456-0.0070-0.0050-0.0311-0.0167-0.02681.00000.00800.0355-0.0152-0.0228-0.00140.00140.0231-0.0028-0.07270.0115-0.00090.0224-0.00690.0387-0.0046-0.05150.03300.01540.0064-0.02300.0139
9 Jaundice0.00880.0007-0.0733-0.00360.00220.0080-0.01670.00801.0000-0.02680.0285-0.0128-0.0238-0.03720.01670.01430.0157-0.0113-0.00730.01050.0187-0.01400.0281-0.01000.03210.03830.0245-0.00890.0197
10 Epigastricpain-0.0119-0.02380.0081-0.0169-0.04860.02680.02240.0355-0.02681.00000.0370-0.0373-0.0027-0.05200.03680.0229-0.06450.0157-0.01700.03440.00450.03460.00440.00390.03130.04970.0829-0.0066-0.0522
11 WBC0.01380.02760.0375-0.0332-0.0221-0.04300.0004-0.01520.02850.03701.00000.00800.0098-0.0156-0.0055-0.0367-0.0124-0.0050-0.0117-0.0428-0.01470.01810.01550.0202-0.0528-0.0185-0.04720.02900.0172
12 RBC-0.0003-0.0040-0.0044-0.02920.01630.0111-0.0167-0.0228-0.0128-0.03730.00801.00000.04300.03410.01590.0126-0.03140.01780.01230.0510-0.05680.00630.00620.0161-0.0613-0.0285-0.0066-0.01980.0077
13 HGB-0.01250.00000.0585-0.02010.0142-0.0290-0.0605-0.0014-0.0238-0.00270.00980.04301.0000-0.0087-0.0138-0.01620.0238-0.00640.0035-0.0376-0.0270-0.0198-0.0520-0.00230.0103-0.00460.00520.02100.0035
14 Plat-0.00240.0196-0.0017-0.00320.0876-0.02480.02860.0014-0.0372-0.0520-0.01560.0341-0.00871.0000-0.00310.0485-0.0250-0.0445-0.00140.0016-0.0066-0.0345-0.0411-0.04000.04890.03810.00560.0345-0.0175
heatmap(corr(dat.Variables,'rows','complete'),'XDisplayLabels',string(dat.Properties.VariableNames),'YDisplayLabels',string(dat.Properties.VariableNames));
% Correlation matrix
% Takes longer to execute
%corrplot(dat)
for i=1:27
[R1, pvalue1(i,1)]=corr(dat.(i),dat.(29));
pvalue2(i,1)=anova1(dat.(i),dat.(29),'off');
%[R3, pvalue3(i,1)]=corr(dat.(i),dat.(29),'type','Spearman'); for
%ordinal/categorical data only
end
% Boneferroni and Benjamin Hochberg method
p_corrected1=27*pvalue1;
fdr1=mafdr(pvalue1);
p_corrected2=27*pvalue2;
fdr2=mafdr(pvalue2);
%p_corrected3=27*pvalue3;
%fdr3=mafdr(pvalue3);
%table(pvalue1,p_corrected1,fdr1,pvalue2,p_corrected2,fdr2,pvalue3,p_corrected3,fdr3,'Rownames',dat.Properties.VariableNames(1:27),'VariableNames',{'Pearson_Pval','Pearson_Qval','Pearson_FDR','Anova_Pval','Anova_Qval','Anova_FDR','Spearman_Pval','Spearman_Qval','Spearman_FDR'})
table(pvalue1,p_corrected1,fdr1,pvalue2,p_corrected2,fdr2,'Rownames',dat.Properties.VariableNames(1:27),'VariableNames',{'Pearson_Pval','Pearson_Qval','Pearson_FDR','Anova_Pval','Anova_Qval','Anova_FDR'})
ans = 27×6 table
 Pearson_PvalPearson_QvalPearson_FDRAnova_PvalAnova_QvalAnova_FDR
1 Age0.492113.28790.61420.34489.30881.0000
2 Gender0.709519.15600.57550.10102.72790.6476
3 BMI0.03590.97050.58310.00940.25310.2404
4 Fever0.26877.25440.48430.549614.83931.0000
5 NauseaVomting0.04101.10790.33290.17654.76420.9048
6 Headache0.921624.88330.57510.987026.64860.9372
7 Diarrhea0.778221.01140.57390.743220.06750.9528
8 Fatiguegeneralizedboneache0.604516.32160.51620.615316.61190.9279
9 Jaundice0.464212.53290.62760.33288.98501.0000
10 Epigastricpain0.05261.41890.28420.08712.35220.7445
11 WBC0.524114.15080.50020.766520.69630.8933
12 RBC0.775620.94100.59920.644417.40000.9179
13 HGB0.896624.20820.58180.764320.63600.9331
14 Plat0.515113.90780.55710.591715.97611.0000

Unsupervised

PCA can be used for mixed data type as long as categorical variables can be reduced to binary variables without loss of data. Variables should also be normalized before using.
% PCA without normalization
X=HCV_arr(:,1:end-2)
X = 1381×27
0.4828 0 0.5385 0 1.0000 1.0000 0 1.0000 1.0000 0 1.0000 0.5100 0 0.2724 0.5843 0.9438 0.6292 0.4045 0.8315 0.4228 0.9593 0.9750 0.0338 0.4482 0.1707 0.4166 0.0384 0.8621 0 0.8462 1.0000 1.0000 1.0000 1.0000 0 0 0 0.1303 0.6695 0.4000 0.4384 0.8315 0.1124 0.6292 0.7640 0.8652 0 0 0 0.4755 0.5503 0 0.9103 0.6896 0.9310 0 0.7692 0 0 1.0000 0 1.0000 1.0000 1.0000 0.0735 0.6572 0.2000 0.7094 0.6742 0.7303 0.3146 0.1011 0.9101 0.7236 0.6911 0.6250 0.5498 0.6147 1.0000 0.4193 0.2997 0.8966 1.0000 0 1.0000 1.0000 1.0000 0 1.0000 1.0000 0 0.9653 0.0549 1.0000 0.2864 0.3034 0.7303 0.9213 0.6404 0.2921 0.5528 0.8862 0.6000 0.9637 0.9044 0 0 0 0.5517 1.0000 0.6154 0 0 1.0000 1.0000 0 0 1.0000 0.4768 0.4904 0.2000 0.9229 0.8989 0.8202 0.4607 0.9888 0.0674 0.7398 0.3902 0.8500 0.5338 0.0600 0.2110 0.4584 0.6248 0.4138 0 0.0769 0 0 1.0000 1.0000 1.0000 0 1.0000 0.8221 0.6589 0.4000 0.4187 0.6067 0.4944 0.1798 0.7079 0.6517 0.9512 0.2764 1.0000 0.4924 0.6302 0 0.4590 0.2506 0.4483 0 0.6154 1.0000 0 1.0000 1.0000 0 0 1.0000 0.4050 0.5315 0.4000 0.0389 0.1798 0.3258 0.3708 0.9888 0.4719 0.9756 0.3089 0.6250 0.9585 0.1918 0.0716 0.3405 0.6855 0.1724 1.0000 0.1538 1.0000 0 1.0000 0 1.0000 1.0000 0 0.1587 0.3732 0.4000 0.5471 0.7191 0.9551 0.8090 0.3933 0.1573 0.9593 0.7805 0.7000 0.8518 0.0859 0.1961 0.5547 0.0740 0.1379 0 0 1.0000 1.0000 0 0 0 0 0 0.3360 0.2611 0.6000 0.3841 0.4045 0.1124 0.6067 0.1461 0.0787 0.3333 0.4390 1.0000 0.1146 0.9345 0.1505 0.0781 0.9949 0.4483 1.0000 0.2308 1.0000 0 0 0 1.0000 0 1.0000 0.6902 0.2500 0.6000 0.8242 0.6517 0.6966 0.3034 0.1573 0.6292 0.4065 0.8049 0.5250 0.7797 0.4468 0 0 0
stage=HCV_arr(:,29)
stage = 1381×1
0.3333 1.0000 0 1.0000 0.6667 0.3333 0.3333 0.3333 0 0
[coeff,score,latent,~,explained]=pca(X);
explained
explained = 27×1
7.8728 7.6349 7.3168 7.2822 7.1734 6.9769 6.7102 6.4450 4.4477 3.3991
b=bar(explained)
b =
Bar with properties: BarLayout: 'grouped' BarWidth: 0.8000 FaceColor: [0 0.4470 0.7410] EdgeColor: [0 0 0] BaseValue: 0 XData: [1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27] YData: [7.8728 7.6349 7.3168 7.2822 7.1734 6.9769 6.7102 6.4450 4.4477 3.3991 2.9204 2.6990 2.6407 2.6087 2.5094 2.4840 2.4116 2.3982 2.3494 2.2584 2.1950 2.1192 1.6427 1.2741 1.2410 0.8659 0.1234] Show all properties
xtips = b.XEndPoints;
ytips = b.YEndPoints;
labels = string(round(b.YData,2));
text(xtips,ytips,labels,'HorizontalAlignment','center',...
'VerticalAlignment','bottom')
xlabel('Principal Component')
ylabel('Percentange of explained variances')
title('Principle Component Analysis')
gscatter(score(:,1),score(:,2),stage,'rkgb')
Consider 0.05 as the variance threshold
features=HCV.Properties.VariableNames(1:end-2)';
PC1=abs(coeff(:,1));
sortrows(table(features,PC1),'PC1','descend')
ans = 27×2 table
 featuresPC1
1'NauseaVomting'0.5587
2'Fatiguegeneralizedboneache'0.5562
3'Gender'0.4201
4'Epigastricpain'0.3312
5'Headache'0.2555
6'Diarrhea'0.1215
7'ALT4'0.0514
8'Plat'0.0401
9'RNAEOT'0.0382
10'Jaundice'0.0352
11'ALT1'0.0288
12'WBC'0.0284
13'ALT24'0.0280
14'RBC'0.0278
PC2=abs(coeff(:,2));
sortrows(table(features,PC2),'PC2','descend')
ans = 27×2 table
 featuresPC2
1'Epigastricpain'0.6733
2'Diarrhea'0.4349
3'Headache'0.3718
4'Gender'0.2903
5'Jaundice'0.2519
6'Fever'0.2106
7'NauseaVomting'0.1121
8'RNAEF'0.0827
9'Fatiguegeneralizedboneache'0.0437
10'RNAEOT'0.0386
11'HGB'0.0356
12'ALT1'0.0300
13'Plat'0.0281
14'Age'0.0204
PC3=abs(coeff(:,3));
sortrows(table(features,PC3),'PC3','descend')
ans = 27×2 table
 featuresPC3
1'Fever'0.6592
2'Gender'0.4542
3'Headache'0.4376
4'Diarrhea'0.3751
5'Epigastricpain'0.0925
6'HGB'0.0629
7'Fatiguegeneralizedboneache'0.0590
8'ALT1'0.0571
9'ALT24'0.0387
10'NauseaVomting'0.0357
11'RNAEOT'0.0286
12'WBC'0.0271
13'BMI'0.0245
14'AST1'0.0236
PC4=abs(coeff(:,4));
sortrows(table(features,PC4),'PC4','descend')
ans = 27×2 table
 featuresPC4
1'Diarrhea'0.5267
2'Jaundice'0.4861
3'Headache'0.4606
4'Gender'0.3342
5'Fever'0.2572
6'Fatiguegeneralizedboneache'0.2082
7'Epigastricpain'0.2062
8'Plat'0.0552
9'AST1'0.0362
10'BMI'0.0351
11'ALT12'0.0343
12'NauseaVomting'0.0303
13'RNA4'0.0241
14'RNAEF'0.0191
%K-means Clustering for continuous variables
k_values=2:5;
n=length(k_values);
s_score=zeros(n,1);
for i=1:n
idx=kmeans(HCV_arr(:,[1,3,11:end-2]),k_values(i));
s=silhouette(HCV_arr(:,[1,3,11:end-2]),idx);
s_score(i)=mean(s);
end
table(k_values',s_score)
ans = 4×2 table
 Var1s_score
120.1424
230.1173
340.0992
450.0949
k=find(s_score==max(s_score))+1
k = 2
idx=kmeans(HCV_arr(:,[1,3,11:end-2]),k);
silhouette(HCV_arr(:,[1,3,11:end-2]),idx)
Can be used to classify into advanced and moderate fibrosis
%K-mediods clustering for categorical variables
k_values=2:5;
n=length(k_values);
s_score=zeros(n,1);
for i=1:n
idx=kmedoids(HCV_arr(:,[2,4:10]),k_values(i));
s=silhouette(HCV_arr(:,[2,4:10]),idx);
s_score(i)=mean(s);
end
table(k_values',s_score)
ans = 4×2 table
 Var1s_score
120.1633
230.1518
340.1511
450.1544
k=find(s_score==max(s_score))+1
k = 2
idx=kmedoids(HCV_arr(:,[2,4:10]),k);
silhouette(HCV_arr(:,[2,4:10]),idx)

Supervised

idx=kmedoids(HCV_arr(:,[2,4:10]),4,"Distance","hamming");
silhouette(HCV_arr(:,[2,4:10]),idx)
idx=kmeans(HCV_arr(:,[1,3,11:end-2]),4);
silhouette(HCV_arr(:,[1,3,11:end-2]),idx)
idx=kmeans(HCV_arr,4,'Distance',"sqeuclidean");
silhouette(HCV_arr,idx)

Aim: Predict stages of fibrosis

Train and Test Data

x=HCV_arr(:,1:27);
y=HCV_arr(:,29);
[Xtrain, Ytrain, Xtest, Ytest]=trainTestSplit(x,y,0.7);

Multinomial Logistic Regression

[b1,dev1,stats1]=mnrfit(Xtrain,categorical(Ytrain))
b1 = 28×3
-0.1953 -0.8443 -0.9515 0.2118 0.5923 0.1500 -0.0224 -0.1728 0.3051 0.6876 0.8584 0.5700 0.1058 0.0884 0.1316 -0.4134 -0.2606 -0.0699 -0.0394 0.0834 -0.0980 0.0816 -0.0387 0.1832 -0.1052 0.2235 -0.0598 -0.1781 0.2239 -0.0928
dev1 = 2.5845e+03
stats1 = struct with fields:
beta: [28×3 double] dfe: 2817 sfit: 1.0147 s: 1 estdisp: 0 covb: [84×84 double] coeffcorr: [84×84 double] se: [28×3 double] t: [28×3 double] p: [28×3 double] resid: [967×4 double] residp: [967×4 double] residd: [967×1 double]
[val,pred1]=max(mnrval(b1,Xtest)')
val = 1×414
0.3511 0.4403 0.4777 0.4167 0.3284 0.3943 0.3315 0.4650 0.3022 0.4101 0.3884 0.3384 0.3986 0.4569 0.2635 0.3430 0.3275 0.3445 0.3295 0.3976 0.3175 0.4002 0.2898 0.2828 0.3060 0.3841 0.3760 0.3361 0.3321 0.3391 0.3499 0.3434 0.3215 0.3527 0.3868 0.3157 0.3163 0.4278 0.3411 0.3833 0.4077 0.3312 0.3146 0.3648 0.3859 0.2783 0.3162 0.3051 0.2943 0.3410
pred1 = 1×414
4 4 4 3 1 4 4 3 1 4 4 2 2 3 4 2 4 4 3 3 1 3 4 1 3 1 1 4 2 4 4 1 3 4 2 4 3 3 1 4 2 2 1 2 2 3 4 3 1 2
pred1=normalize(pred1','range')
pred1 = 414×1
1.0000 1.0000 1.0000 0.6667 0 1.0000 1.0000 0.6667 0 1.0000
accuracy1=sum(pred1==Ytest)/length(Ytest)
accuracy1 = 0.0773
confusionmat(double(Ytest),double(pred1))
Error using confusionmat (line 66)
Class List in given inputs are different

5 fold cross validation

fold=5;
index1=crossvalind('kfold',size(Xtrain,1),fold);
accuracy1=zeros(1,fold);
%Model 1
for i=1:fold
test=(index1==i);
train=~test;
Xtrain=x(train,:);
Xtest=x(test,:);
Ytrain=y(train,:);
Ytest=y(test,:);
%Multinomial Logistic Regression
[b1,dev1,stats1]=mnrfit(Xtrain,categorical(Ytrain));
[~,pred1]=max(mnrval(b1,Xtest)');
pred1=normalize(pred1','range');
accuracy1(i)=sum(pred1==Ytest)/length(Ytest)
confusionmat(double(Ytest),double(pred1))
end
accuracy1 = 1×5
0.2642 0 0 0 0
Class List in given sample 0 0.3333 0.6667 1.0000 Total Instance = 193 class1==>0 class2==>0.33333 class3==>0.66667 class4==>1 Confusion Matrix predict_class1 predict_class2 predict_class3 predict_class4 ______________ ______________ ______________ ______________ Actual_class1 11 7 7 21 Actual_class2 13 7 7 10 Actual_class3 18 12 11 25 Actual_class4 10 4 8 22 Multi-Class Confusion Matrix Output TruePositive FalsePositive FalseNegative TrueNegative ____________ _____________ _____________ ____________ Actual_class1 11 41 35 106 Actual_class2 7 23 30 133 Actual_class3 11 22 55 105 Actual_class4 22 56 22 93 Class Accuracy Error Sensitivity Specificity Precision FalsePositiveRate F1_score MatthewsCorrelationCoefficient Kappa ____________________ ________ _______ ___________ ___________ _________ _________________ ________ ______________________________ _______ {'class1==>0' } 0.056995 0.94301 0.23913 0.72109 0.21154 0.27891 0.22449 0.038204 0.59773 {'class2==>0.33333'} 0.036269 0.96373 0.18919 0.85256 0.23333 0.14744 0.20896 0.045363 0.70163 {'class3==>0.66667'} 0.056995 0.94301 0.16667 0.82677 0.33333 0.17323 0.22222 0.0082675 0.58005 {'class4==>1' } 0.11399 0.88601 0.5 0.62416 0.28205 0.37584 0.36066 0.10615 0.49453 Over all valuses Accuracy: 0.2642 Error: 0.7358 Sensitivity: 0.2737 Specificity: 0.7561 Precision: 0.2651 FalsePositiveRate: 0.2439 F1_score: 0.2541 MatthewsCorrelationCoefficient: 0.0495 Kappa: 0.4903
ans = 4×4
11 7 7 21 13 7 7 10 18 12 11 25 10 4 8 22
accuracy1 = 1×5
0.2642 0.1753 0 0 0
Class List in given sample 0 0.3333 0.6667 1.0000 Total Instance = 194 class1==>0 class2==>0.33333 class3==>0.66667 class4==>1 Confusion Matrix predict_class1 predict_class2 predict_class3 predict_class4 ______________ ______________ ______________ ______________ Actual_class1 11 7 11 21 Actual_class2 12 5 16 14 Actual_class3 19 12 4 14 Actual_class4 14 6 14 14 Multi-Class Confusion Matrix Output TruePositive FalsePositive FalseNegative TrueNegative ____________ _____________ _____________ ____________ Actual_class1 11 45 39 99 Actual_class2 5 25 42 122 Actual_class3 4 41 45 104 Actual_class4 14 49 34 97 Class Accuracy Error Sensitivity Specificity Precision FalsePositiveRate F1_score MatthewsCorrelationCoefficient Kappa ____________________ ________ _______ ___________ ___________ _________ _________________ ________ ______________________________ _______ {'class1==>0' } 0.056701 0.9433 0.22 0.6875 0.19643 0.3125 0.20755 0.089284 0.5785 {'class2==>0.33333'} 0.025773 0.97423 0.10638 0.82993 0.16667 0.17007 0.12987 0.075468 0.6695 {'class3==>0.66667'} 0.020619 0.97938 0.081633 0.71724 0.088889 0.28276 0.085106 0.20704 0.62491 {'class4==>1' } 0.072165 0.92784 0.29167 0.66438 0.22222 0.33562 0.25225 0.040499 0.55653 Over all valuses Accuracy: 0.1753 Error: 0.8247 Sensitivity: 0.1749 Specificity: 0.7248 Precision: 0.1686 FalsePositiveRate: 0.2752 F1_score: 0.1687 MatthewsCorrelationCoefficient: 0.1031 Kappa: 0.5453
ans = 4×4
11 7 11 21 12 5 16 14 19 12 4 14 14 6 14 14
accuracy1 = 1×5
0.2642 0.1753 0.2371 0 0
Class List in given sample 0 0.3333 0.6667 1.0000 Total Instance = 194 class1==>0 class2==>0.33333 class3==>0.66667 class4==>1 Confusion Matrix predict_class1 predict_class2 predict_class3 predict_class4 ______________ ______________ ______________ ______________ Actual_class1 11 10 16 10 Actual_class2 9 10 18 10 Actual_class3 7 7 11 12 Actual_class4 20 15 14 14 Multi-Class Confusion Matrix Output TruePositive FalsePositive FalseNegative TrueNegative ____________ _____________ _____________ ____________ Actual_class1 11 36 36 111 Actual_class2 10 32 37 115 Actual_class3 11 48 26 109 Actual_class4 14 32 49 99 Class Accuracy Error Sensitivity Specificity Precision FalsePositiveRate F1_score MatthewsCorrelationCoefficient Kappa ____________________ ________ _______ ___________ ___________ _________ _________________ ________ ______________________________ _______ {'class1==>0' } 0.056701 0.9433 0.23404 0.7551 0.23404 0.2449 0.23404 0.010855 0.61078 {'class2==>0.33333'} 0.051546 0.94845 0.21277 0.78231 0.2381 0.21769 0.22472 0.0051195 0.6269 {'class3==>0.66667'} 0.056701 0.9433 0.2973 0.69427 0.18644 0.30573 0.22917 0.0072036 0.59839 {'class4==>1' } 0.072165 0.92784 0.22222 0.75573 0.30435 0.24427 0.25688 0.02428 0.56042 Over all valuses Accuracy: 0.2371 Error: 0.7629 Sensitivity: 0.2416 Specificity: 0.7469 Precision: 0.2407 FalsePositiveRate: 0.2531 F1_score: 0.2362 MatthewsCorrelationCoefficient: 0.0119 Kappa: 0.5084
ans = 4×4
11 10 16 10 9 10 18 10 7 7 11 12 20 15 14 14
accuracy1 = 1×5
0.2642 0.1753 0.2371 0.2021 0
Class List in given sample 0 0.3333 0.6667 1.0000 Total Instance = 193 class1==>0 class2==>0.33333 class3==>0.66667 class4==>1 Confusion Matrix predict_class1 predict_class2 predict_class3 predict_class4 ______________ ______________ ______________ ______________ Actual_class1 8 7 24 14 Actual_class2 12 7 7 16 Actual_class3 13 9 7 16 Actual_class4 7 12 17 17 Multi-Class Confusion Matrix Output TruePositive FalsePositive FalseNegative TrueNegative ____________ _____________ _____________ ____________ Actual_class1 8 32 45 108 Actual_class2 7 28 35 123 Actual_class3 7 48 38 100 Actual_class4 17 46 36 94 Class Accuracy Error Sensitivity Specificity Precision FalsePositiveRate F1_score MatthewsCorrelationCoefficient Kappa ____________________ ________ _______ ___________ ___________ _________ _________________ ________ ______________________________ _______ {'class1==>0' } 0.041451 0.95855 0.15094 0.77143 0.2 0.22857 0.17204 0.085476 0.61605 {'class2==>0.33333'} 0.036269 0.96373 0.16667 0.81457 0.2 0.18543 0.18182 0.020094 0.66792 {'class3==>0.66667'} 0.036269 0.96373 0.15556 0.67568 0.12727 0.32432 0.14 0.15809 0.60026 {'class4==>1' } 0.088083 0.91192 0.32075 0.67143 0.26984 0.32857 0.2931 0.0074402 0.53751 Over all valuses Accuracy: 0.2021 Error: 0.7979 Sensitivity: 0.1985 Specificity: 0.7333 Precision: 0.1993 FalsePositiveRate: 0.2667 F1_score: 0.1967 MatthewsCorrelationCoefficient: 0.0678 Kappa: 0.5300
ans = 4×4
8 7 24 14 12 7 7 16 13 9 7 16 7 12 17 17
accuracy1 = 1×5
0.2642 0.1753 0.2371 0.2021 0.2487
Class List in given sample 0 0.3333 0.6667 1.0000 Total Instance = 193 class1==>0 class2==>0.33333 class3==>0.66667 class4==>1 Confusion Matrix predict_class1 predict_class2 predict_class3 predict_class4 ______________ ______________ ______________ ______________ Actual_class1 9 14 13 13 Actual_class2 10 9 14 16 Actual_class3 17 8 7 18 Actual_class4 8 4 10 23 Multi-Class Confusion Matrix Output TruePositive FalsePositive FalseNegative TrueNegative ____________ _____________ _____________ ____________ Actual_class1 9 35 40 109 Actual_class2 9 26 40 118 Actual_class3 7 37 43 106 Actual_class4 23 47 22 101 Class Accuracy Error Sensitivity Specificity Precision FalsePositiveRate F1_score MatthewsCorrelationCoefficient Kappa ____________________ ________ _______ ___________ ___________ _________ _________________ ________ ______________________________ _______ {'class1==>0' } 0.046632 0.95337 0.18367 0.75694 0.20455 0.24306 0.19355 0.061605 0.61599 {'class2==>0.33333'} 0.046632 0.95337 0.18367 0.81944 0.25714 0.18056 0.21429 0.0035219 0.64007 {'class3==>0.66667'} 0.036269 0.96373 0.14 0.74126 0.15909 0.25874 0.14894 0.124 0.61719 {'class4==>1' } 0.11917 0.88083 0.51111 0.68243 0.32857 0.31757 0.4 0.17022 0.51554 Over all valuses Accuracy: 0.2487 Error: 0.7513 Sensitivity: 0.2546 Specificity: 0.7500 Precision: 0.2373 FalsePositiveRate: 0.2500 F1_score: 0.2392 MatthewsCorrelationCoefficient: 0.0898 Kappa: 0.5009
ans = 4×4
9 14 13 13 10 9 14 16 17 8 7 18 8 4 10 23
avg_accuracy_5=mean(accuracy1)
avg_accuracy_5 = 0.2255

Aim: Predict Advanced fibrosis

Train and Test Data

x=HCV_arr(:,1:27);
y=HCV_array(:,29)>2;
y = 1381×1 logical array
0 1 0 1 1 0 0 0 0 0
[Xtrain, Ytrain, Xtest, Ytest]=trainTestSplit(x,y,0.7);

Multinomial Logistic Regression

[b1,dev1,stats1]=mnrfit(Xtrain,categorical(Ytrain))
b1 = 28×1
-0.3226 0.3322 -0.1850 0.5297 0.0117 -0.1470 -0.0704 0.0176 -0.0642 0.0563
dev1 = 1.3139e+03
stats1 = struct with fields:
beta: [28×1 double] dfe: 939 sfit: 1.0148 s: 1 estdisp: 0 covb: [28×28 double] coeffcorr: [28×28 double] se: [28×1 double] t: [28×1 double] p: [28×1 double] resid: [967×2 double] residp: [967×2 double] residd: [967×1 double]
[val,pred1]=max(mnrval(b1,Xtest)')
val = 1×414
0.5851 0.5757 0.5320 0.5516 0.6350 0.6413 0.5259 0.5919 0.6021 0.6115 0.5524 0.5382 0.5062 0.5550 0.5170 0.5764 0.5766 0.6444 0.5531 0.6230 0.5541 0.5732 0.5799 0.5638 0.5393 0.5111 0.5296 0.5758 0.5595 0.5557 0.5056 0.5024 0.5596 0.5238 0.5090 0.5815 0.5141 0.5064 0.5254 0.5931 0.5395 0.5364 0.5369 0.5884 0.5800 0.5808 0.5296 0.5383 0.5902 0.5040
pred1 = 1×414
2 2 2 2 2 1 2 2 2 1 2 2 2 1 1 1 1 1 2 2 2 2 2 2 2 1 2 2 2 2 1 1 1 1 1 2 2 2 2 2 1 1 2 2 1 1 2 1 1 2
pred1=normalize(pred1','range')
pred1 = 414×1
1 1 1 1 1 0 1 1 1 0
accuracy1=sum(pred1==Ytest)/length(Ytest)
accuracy1 = 0.5483
confusionmat(double(Ytest),double(pred1))
Class List in given sample 0 1 Total Instance = 414 class1==>0 class2==>1 Confusion Matrix predict_class1 predict_class2 ______________ ______________ Actual_class1 91 109 Actual_class2 78 136 Two-Class Confution Matrix '' 'TruePositive' 'FalsePositive' 'FalseNegative' [ 91] [ 109] 'TrueNegative=TN' [ 78] [ 136] Over all valuses Accuracy: 0.5483 Error: 0.4517 Sensitivity: 0.4550 Specificity: 0.6355 Precision: 0.5385 FalsePositiveRate: 0.3645 F1_score: 0.4932 MatthewsCorrelationCoefficient: 0.0920 Kappa: 0.0910
ans = 2×2
91 109 78 136

5 fold cross validation

fold=5;
index1=crossvalind('kfold',size(Xtrain,1),fold);
accuracy1=zeros(1,fold);
accuracy2=zeros(1,fold);
precision2=zeros(1,fold);
recall2=zeros(1,fold);
%Model 1
for i=1:fold
test=(index1==i);
train=~test;
Xtrain=x(train,:);
Xtest=x(test,:);
Ytrain=y(train,:);
Ytest=y(test,:);
%Multinomial Logistic Regression
[b1,dev1,stats1]=mnrfit(Xtrain,categorical(Ytrain));
[~,pred1]=max(mnrval(b1,Xtest)');
pred1=normalize(pred1','range');
accuracy1(i)=sum(pred1==Ytest)/length(Ytest);
confusionmat(double(Ytest),double(pred1))
%Linear Regression
b2=fitlm(Xtrain,Ytrain);
pred2=round(predict(b2,Xtest));
accuracy2(i)=sum(pred2==Ytest)/length(Ytest);
precision2(i)=sum(pred2==1 & Ytest==1)/sum(pred2==1);
recall2(i)=sum(pred2==1 & Ytest==1)/sum(Ytest==1);
end
Class List in given sample 0 1 Total Instance = 194 class1==>0 class2==>1 Confusion Matrix predict_class1 predict_class2 ______________ ______________ Actual_class1 38 55 Actual_class2 43 58 Two-Class Confution Matrix '' 'TruePositive' 'FalsePositive' 'FalseNegative' [ 38] [ 55] 'TrueNegative=TN' [ 43] [ 58] Over all valuses Accuracy: 0.4948 Error: 0.5052 Sensitivity: 0.4086 Specificity: 0.5743 Precision: 0.4691 FalsePositiveRate: 0.4257 F1_score: 0.4368 MatthewsCorrelationCoefficient: 0.0174 Kappa: 0.0169
ans = 2×2
38 55 43 58
Class List in given sample 0 1 Total Instance = 193 class1==>0 class2==>1 Confusion Matrix predict_class1 predict_class2 ______________ ______________ Actual_class1 41 48 Actual_class2 44 60 Two-Class Confution Matrix '' 'TruePositive' 'FalsePositive' 'FalseNegative' [ 41] [ 48] 'TrueNegative=TN' [ 44] [ 60] Over all valuses Accuracy: 0.5233 Error: 0.4767 Sensitivity: 0.4607 Specificity: 0.5769 Precision: 0.4824 FalsePositiveRate: 0.4231 F1_score: 0.4713 MatthewsCorrelationCoefficient: 0.0378 Kappa: 0.0377
ans = 2×2
41 48 44 60
Class List in given sample 0 1 Total Instance = 193 class1==>0 class2==>1 Confusion Matrix predict_class1 predict_class2 ______________ ______________ Actual_class1 37 62 Actual_class2 40 54 Two-Class Confution Matrix '' 'TruePositive' 'FalsePositive' 'FalseNegative' [ 37] [ 62] 'TrueNegative=TN' [ 40] [ 54] Over all valuses Accuracy: 0.4715 Error: 0.5285 Sensitivity: 0.3737 Specificity: 0.5745 Precision: 0.4805 FalsePositiveRate: 0.4255 F1_score: 0.4205 MatthewsCorrelationCoefficient: 0.0529 Kappa: 0.0490
ans = 2×2
37 62 40 54
Class List in given sample 0 1 Total Instance = 194 class1==>0 class2==>1 Confusion Matrix predict_class1 predict_class2 ______________ ______________ Actual_class1 42 58 Actual_class2 30 64 Two-Class Confution Matrix '' 'TruePositive' 'FalsePositive' 'FalseNegative' [ 42] [ 58] 'TrueNegative=TN' [ 30] [ 64] Over all valuses Accuracy: 0.5464 Error: 0.4536 Sensitivity: 0.4200 Specificity: 0.6809 Precision: 0.5833 FalsePositiveRate: 0.3191 F1_score: 0.4884 MatthewsCorrelationCoefficient: 0.1043 Kappa: 0.1000
ans = 2×2
42 58 30 64
Class List in given sample 0 1 Total Instance = 193 class1==>0 class2==>1 Confusion Matrix predict_class1 predict_class2 ______________ ______________ Actual_class1 45 41 Actual_class2 49 58 Two-Class Confution Matrix '' 'TruePositive' 'FalsePositive' 'FalseNegative' [ 45] [ 41] 'TrueNegative=TN' [ 49] [ 58] Over all valuses Accuracy: 0.5337 Error: 0.4663 Sensitivity: 0.5233 Specificity: 0.5421 Precision: 0.4787 FalsePositiveRate: 0.4579 F1_score: 0.5000 MatthewsCorrelationCoefficient: 0.0649 Kappa: 0.0647
ans = 2×2
45 41 49 58
avg_accuracy_multinom=mean(accuracy1)
avg_accuracy_multinom = 0.5139
avg_accuracy_lm=mean(accuracy2)
avg_accuracy_lm = 0.5150
avg_precision_lm=mean(precision2)
avg_precision_lm = 0.5302
avg_recall_lm=mean(recall2)
avg_recall_lm = 0.5897

Selecting subset of features

mdl1=fitlm(HCV_arr(:,[1:27]),HCV_array(:,29)>2,'CategoricalVars',{'x2','x4','x5','x6','x7','x8','x9','x10'})
mdl1 =
Linear regression model: y ~ 1 + x1 + x2 + x3 + x4 + x5 + x6 + x7 + x8 + x9 + x10 + x11 + x12 + x13 + x14 + x15 + x16 + x17 + x18 + x19 + x20 + x21 + x22 + x23 + x24 + x25 + x26 + x27 Estimated Coefficients: Estimate SE tStat pValue __________ ________ ________ __________ (Intercept) 0.5414 0.12464 4.3437 1.5058e-05 x1 -0.042526 0.044582 -0.95389 0.34031 x2_1 0.041713 0.027045 1.5424 0.12322 x3 -0.10922 0.043299 -2.5225 0.011767 x4_1 -0.02126 0.026998 -0.78744 0.43116 x5_1 0.056584 0.027178 2.082 0.03753 x6_1 -0.0028242 0.027049 -0.10441 0.91686 x7_1 -0.0049068 0.027175 -0.18057 0.85673 x8_1 -0.0049303 0.027096 -0.18195 0.85565 x9_1 -0.0095379 0.027047 -0.35264 0.72442 x10_1 -0.050246 0.027259 -1.8433 0.065504 x11 0.027231 0.046256 0.5887 0.55616 x12 -0.017758 0.047097 -0.37705 0.7062 x13 0.0061583 0.039533 0.15578 0.87623 x14 -0.048983 0.046852 -1.0455 0.29599 x15 -0.026526 0.046228 -0.5738 0.5662 x16 0.042441 0.046723 0.90835 0.36386 x17 -0.017705 0.045606 -0.38822 0.69792 x18 0.0090752 0.046288 0.19606 0.84459 x19 -0.020025 0.046095 -0.43443 0.66404 x20 0.0014998 0.063123 0.023759 0.98105 x21 -0.037244 0.063803 -0.58372 0.5595 x22 0.09688 0.076792 1.2616 0.20731 x23 0.057567 0.045829 1.2561 0.20929 x24 -0.043943 0.044944 -0.97774 0.32838 x25 0.20058 0.20293 0.98839 0.32314 x26 -0.081096 0.04809 -1.6864 0.091959 x27 0.095518 0.04791 1.9937 0.046386 Number of observations: 1381, Error degrees of freedom: 1353 Root Mean Squared Error: 0.499 R-squared: 0.0231, Adjusted R-Squared: 0.00363 F-statistic vs. constant model: 1.19, p-value = 0.234
newmdl1=step(mdl1)
1. Adding x6:x13, FStat = 6.9063, pValue = 0.0086867
newmdl1 =
Linear regression model: y ~ 1 + x1 + x2 + x3 + x4 + x5 + x7 + x8 + x9 + x10 + x11 + x12 + x14 + x15 + x16 + x17 + x18 + x19 + x20 + x21 + x22 + x23 + x24 + x25 + x26 + x27 + x6*x13 Estimated Coefficients: Estimate SE tStat pValue __________ ________ ________ __________ (Intercept) 0.47374 0.12701 3.73 0.00019938 x1 -0.039696 0.044498 -0.89207 0.37251 x2_1 0.043959 0.027 1.6281 0.10373 x3 -0.10663 0.043216 -2.4674 0.013731 x4_1 -0.018319 0.026963 -0.67941 0.49699 x5_1 0.056083 0.027119 2.068 0.03883 x6_1 0.10507 0.049133 2.1385 0.032658 x7_1 -0.0040499 0.027117 -0.14935 0.8813 x8_1 -0.0062626 0.027042 -0.23159 0.81689 x9_1 -0.0098807 0.026989 -0.3661 0.71434 x10_1 -0.050839 0.0272 -1.8691 0.061829 x11 0.031257 0.046181 0.67683 0.49863 x12 -0.013836 0.047018 -0.29427 0.76859 x13 0.10932 0.05565 1.9644 0.049693 x14 -0.052785 0.046773 -1.1285 0.25929 x15 -0.016298 0.046292 -0.35206 0.72485 x16 0.041809 0.046622 0.89677 0.37 x17 -0.022107 0.045537 -0.48546 0.62743 x18 0.0060941 0.046201 0.1319 0.89508 x19 -0.023943 0.046019 -0.5203 0.60294 x20 0.0087626 0.063046 0.13899 0.88948 x21 -0.029889 0.063726 -0.46903 0.63913 x22 0.091319 0.076654 1.1913 0.23374 x23 0.057315 0.04573 1.2533 0.2103 x24 -0.041882 0.044853 -0.93376 0.3506 x25 0.20109 0.20249 0.99307 0.32086 x26 -0.077427 0.048005 -1.6129 0.107 x27 0.095373 0.047806 1.995 0.046243 x6_1:x13 -0.208 0.07915 -2.628 0.0086867 Number of observations: 1381, Error degrees of freedom: 1352 Root Mean Squared Error: 0.498 R-squared: 0.0281, Adjusted R-Squared: 0.00796 F-statistic vs. constant model: 1.4, p-value = 0.0828
mdls=stepwiselm(HCV_arr(:,[1:27]),HCV_array(:,29)>2)
1. Adding x3, FStat = 6.613, pValue = 0.010228 2. Adding x5, FStat = 4.8159, pValue = 0.028364
mdls =
Linear regression model: y ~ 1 + x3 + x5 Estimated Coefficients: Estimate SE tStat pValue ________ ________ _______ __________ (Intercept) 0.54435 0.02881 18.894 5.5214e-71 x3 -0.1107 0.042771 -2.5882 0.0097499 x5 0.058839 0.026812 2.1945 0.028364 Number of observations: 1381, Error degrees of freedom: 1378 Root Mean Squared Error: 0.498 R-squared: 0.00824, Adjusted R-Squared: 0.0068 F-statistic vs. constant model: 5.72, p-value = 0.00335
m=fitlm(Xtrain(:,[3,5]),Ytrain)
m =
Linear regression model: y ~ 1 + x1 + x2 Estimated Coefficients: Estimate SE tStat pValue _________ ________ ______ __________ (Intercept) 0.51172 0.038815 13.184 6.1446e-36 x1 -0.085468 0.057401 -1.489 0.1369 x2 0.080471 0.035843 2.2451 0.025047 Number of observations: 774, Error degrees of freedom: 771 Root Mean Squared Error: 0.499 R-squared: 0.00928, Adjusted R-Squared: 0.00671 F-statistic vs. constant model: 3.61, p-value = 0.0275
p=round(predict(m,Xtest(:,[3,5])))
p = 193×1
1 1 1 0 1 1 0 0 1 1
acc=sum(pred2==Ytest)/length(Ytest)
acc = 0.5389
prec=sum(pred2==1 & Ytest==1)/sum(pred2==1)
prec = 0.5918
rec=sum(pred2==1 & Ytest==1)/sum(Ytest==1)
rec = 0.5421
Accuracy, Precision and Recall increase